feat: handle GitHub rate limits with automatic batching #125
Conversation
- Add MAX_COMMENTS_PER_REVIEW constant (100) to stay within GitHub's limits
- Implement batchComments() to split suggestions into batches of 100
- Implement createBatchReviewBody() to add informative messages about batching
- Add isRateLimitError() helper to detect rate limit errors (429/403)
- Update run() to create multiple reviews when needed
- Handle rate limit errors gracefully, logging the reset time
- Continue processing batches even if one fails with a 422 error
- Add comprehensive tests for rate limiting (14 new test cases)
- Update README with rate limiting documentation

Co-authored-by: parkerbxyz <17183625+parkerbxyz@users.noreply.github.com>
- Change successfulComments from a counter to an array to track the actual successful comments
- Fix returned comments to include only those from successful batches
- Extract helper functions createMockComments() and createMockDiff() to reduce test duplication
- Add a test to verify correct tracking of successful comments when the middle batch fails
- Improve code maintainability and correctness

Co-authored-by: parkerbxyz <17183625+parkerbxyz@users.noreply.github.com>
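The helpers named in the commit messages can be sketched roughly as follows. Only the names (`batchComments`, `isRateLimitError`, `MAX_COMMENTS_PER_REVIEW`) come from the commits; the signatures and the `ReviewComment` shape are assumptions.

```typescript
const MAX_COMMENTS_PER_REVIEW = 100

// Assumed shape of a draft review comment.
interface ReviewComment {
  path: string
  line: number
  body: string
}

// Split a flat list of suggestions into chunks of at most 100 comments,
// so each chunk can be posted as one review.
function batchComments(comments: ReviewComment[]): ReviewComment[][] {
  const batches: ReviewComment[][] = []
  for (let i = 0; i < comments.length; i += MAX_COMMENTS_PER_REVIEW) {
    batches.push(comments.slice(i, i + MAX_COMMENTS_PER_REVIEW))
  }
  return batches
}

// Treat HTTP 429, and 403 responses that mention rate limiting, as rate
// limit errors (GitHub uses both status codes for rate limiting).
function isRateLimitError(error: { status?: number; message?: string }): boolean {
  if (error.status === 429) return true
  return error.status === 403 && /rate limit/i.test(error.message ?? '')
}
```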
@copilot Does Octokit already handle some or all of this? Please check the Octokit.js docs with Context7.
Yes, Octokit.js has built-in rate limit handling through its throttling plugin (the octokit package bundles @octokit/plugin-throttling). What Octokit handles automatically:
What our implementation adds:
The per-review comment limit (~100) is a different constraint than the API rate limit (5,000 requests/hour). Octokit handles the latter, but we need custom batching for the former, since a single review request cannot carry more than ~100 comments. Our implementation complements Octokit's rate limiting rather than duplicating it.
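For reference, a sketch of the hooks the throttling plugin exposes. The callbacks below would normally be passed as the `throttle` option to `new Octokit({ auth, throttle })` from the octokit package; they are shown standalone here so the retry policy is visible.

```typescript
// Minimal shape of the request metadata the throttling plugin passes in.
type ThrottleRequestOptions = { method: string; url: string }

const throttle = {
  // Called when the primary rate limit (5,000 requests/hour) is exhausted.
  // Returning true tells Octokit to wait `retryAfter` seconds and retry.
  onRateLimit: (
    retryAfter: number,
    options: ThrottleRequestOptions,
    _octokit: unknown,
    retryCount: number,
  ): boolean => {
    console.warn(`Rate limit hit for ${options.method} ${options.url}; retry after ${retryAfter}s`)
    return retryCount < 1 // retry at most once
  },
  // Called on secondary (abuse-detection) limits; do not retry automatically.
  onSecondaryRateLimit: (retryAfter: number, options: ThrottleRequestOptions): boolean => {
    console.warn(`Secondary rate limit for ${options.method} ${options.url}`)
    return false
  },
}
```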
@copilot Is this an actual limit or just a guess?
It's an undocumented but widely observed limit, based on my research. Evidence:
Why 100:
Conservative approach:
@copilot Posting review comments without a reason ( |
Unfortunately, that approach won't work with the GitHub API. While the web UI allows adding comments to a pending review incrementally, the API has a critical limitation. API limitation:
Why our approach is necessary:
Alternative considered: Our batching approach is the most practical solution given current API constraints. The batch messaging helps users understand why multiple reviews appear.
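The constraint above means each batch has to be one complete request payload: the REST API accepts review comments only as a single array on `POST /repos/{owner}/{repo}/pulls/{pull_number}/reviews`, with no endpoint for appending to a pending review. A sketch of building such a payload (the `buildReviewRequest` helper and the batch-note wording are hypothetical, not from the PR):

```typescript
interface DraftComment { path: string; line: number; body: string }

// Build the full request body for one review batch. Every comment in the
// batch must be included here; comments cannot be appended afterwards.
function buildReviewRequest(
  owner: string,
  repo: string,
  pullNumber: number,
  batch: DraftComment[],
  batchIndex: number,
  batchCount: number,
) {
  // Only mention batching when there is more than one review.
  const note = batchCount > 1 ? `\n\n_Review ${batchIndex + 1} of ${batchCount}._` : ''
  return {
    owner,
    repo,
    pull_number: pullNumber,
    event: 'COMMENT' as const,
    body: `Suggested changes${note}`,
    comments: batch, // all comments for this review, in one call
  }
}
```

The returned object matches what `octokit.rest.pulls.createReview` expects, so each batch maps to exactly one API call.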
GitHub limits pull request reviews to ~100 comments. Large PRs with many linter/formatter fixes exceed this limit, causing silent comment drops or review failures.
Changes
Batching logic
- Split suggestions into batches of at most 100 (MAX_COMMENTS_PER_REVIEW)

Error handling
User experience
Testing
- Extract test helpers (createMockComments, createMockDiff) to reduce duplication

Example
With 250 suggestions, the action creates 3 reviews (100 + 100 + 50 comments).
If review 2 fails with 422, reviews 1 and 3 succeed, returning 150 comments total.
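The partial-failure behaviour in this example can be sketched as follows. `postBatches` and `createReview` are stand-in names (not the PR's actual identifiers): a batch that fails with 422 is skipped, and only comments from successful batches are returned.

```typescript
// Post each batch as its own review; skip batches rejected with 422 and
// return only the comments that were actually posted.
async function postBatches<T>(
  batches: T[][],
  createReview: (batch: T[]) => Promise<void>,
): Promise<T[]> {
  const successfulComments: T[] = []
  for (const batch of batches) {
    try {
      await createReview(batch)
      successfulComments.push(...batch)
    } catch (error) {
      // A 422 means this batch was invalid; keep going with the rest.
      if ((error as { status?: number }).status === 422) continue
      // Rate limit and other errors propagate to the caller.
      throw error
    }
  }
  return successfulComments
}
```

With batches of 100, 100, and 50 comments and the second call failing with 422, this returns the 150 comments from reviews 1 and 3, matching the example above.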