Iterating Over Large Datasets and Total Record Count #76
Changes Made
Added documentation for the `scrollAcrossEntities` and `aggregateAcrossEntities` GraphQL APIs, which are essential for managing large datasets and understanding the scale of data within DataHub. The new sections give users clear instructions on iterating over large result sets and determining the total number of records, including example queries and guidance on handling the results.
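As a rough illustration of the pagination pattern these docs cover, a `scrollAcrossEntities` query might look like the sketch below. Field and input names are assumptions based on common DataHub usage and may differ across DataHub versions; the key idea is passing the `nextScrollId` from each response back into the next request until it comes back empty:

```graphql
# Sketch: scroll through all DATASET entities, 10 at a time.
# Omit scrollId on the first request; thereafter, pass the
# nextScrollId returned by the previous response.
query scrollAcrossEntities {
  scrollAcrossEntities(
    input: { types: [DATASET], query: "*", count: 10 }
  ) {
    nextScrollId   # feed this into the next request's scrollId
    count          # number of results in this page
    searchResults {
      entity {
        urn
        type
      }
    }
  }
}
```

The total record count, by contrast, would come from an aggregation query such as `aggregateAcrossEntities`, which returns counts without paging through every entity.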