chore(deps): bump aiohttp from 3.12.9 to 3.13.3 #5396
base: main
Conversation
---
updated-dependencies:
- dependency-name: aiohttp
  dependency-version: 3.13.3
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Important: Review skipped. Bot user detected.
atproto = "^0.0.64"
django-redis = "^5.4.0"
uvicorn = "^0.34.0"
channels = "^4.2.2"
Bug: The aiohttp upgrade introduces a 32 MiB decompression limit. download_and_extract_zip doesn't handle this, causing crashes when downloading large compressed repository ZIPs from GitHub.
Severity: CRITICAL | Confidence: High
🔍 Detailed Analysis
The upgrade to aiohttp version 3.13.3 introduces a default 32 MiB decompression limit. The download_and_extract_zip method in website/consumers.py uses a default aiohttp.ClientSession() to download compressed repository ZIP files from GitHub. When response.read() is called on a response for a large repository, the decompressed data can exceed this limit, raising a ClientPayloadError. This exception is not handled correctly, causing the WebSocket connection to crash and breaking the repository similarity analysis feature for users analyzing large repositories.
💡 Suggested Fix
In website/consumers.py, pass a larger max_response_buffer_size when creating the aiohttp.ClientSession. Alternatively, refactor the download logic to stream the response body to disk in chunks instead of reading it all into memory with response.read(). Either approach prevents ClientPayloadError exceptions when downloading large compressed files.
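A minimal sketch of the streaming alternative. The name download_and_extract_zip comes from the report; the write_chunks helper, the chunk size, and the on-disk ZIP path are illustrative assumptions, not the project's actual code:

```python
# Sketch: stream the GitHub ZIP to disk in chunks instead of buffering it
# with response.read(), so no single in-memory body can trip aiohttp's
# response-size limits. write_chunks is a hypothetical helper, split out
# so the chunk-writing logic can be tested without a network connection.
import zipfile
from pathlib import Path

CHUNK_SIZE = 64 * 1024  # 64 KiB per chunk; tune as needed


async def write_chunks(chunks, dest: Path) -> int:
    """Write an async iterator of byte chunks to dest; return bytes written."""
    written = 0
    with open(dest, "wb") as fh:
        async for chunk in chunks:
            fh.write(chunk)
            written += len(chunk)
    return written


async def download_and_extract_zip(url: str, extract_dir: Path) -> None:
    # aiohttp imported lazily so the pure helpers above stay importable
    # (and testable) in environments without aiohttp installed
    import aiohttp

    zip_path = extract_dir / "repo.zip"
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            resp.raise_for_status()
            # iter_chunked yields the body incrementally; the full
            # (decompressed) payload is never held in memory at once
            await write_chunks(resp.content.iter_chunked(CHUNK_SIZE), zip_path)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(extract_dir)
```

Streaming also bounds memory use regardless of repository size, which the buffer-size workaround does not.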
🤖 Prompt for AI Agent
Review the code at the location below. A potential bug has been identified by an AI
agent.
Verify if this is a real issue. If it is, propose a fix; if not, explain why it's not
valid.
Location: pyproject.toml#L58
Potential issue: The upgrade to `aiohttp` version 3.13.3 introduces a default 32 MiB
decompression limit. The `download_and_extract_zip` method in `website/consumers.py`
uses a default `aiohttp.ClientSession()` to download compressed repository ZIP files
from GitHub. When `response.read()` is called on a response for a large repository, the
decompressed data can exceed this limit, raising a `ClientPayloadError`. This exception
is not handled correctly, causing the WebSocket connection to crash and breaking the
repository similarity analysis feature for users analyzing large repositories.
Reference ID: 8144368
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.

Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- @dependabot rebase will rebase this PR
- @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
- @dependabot merge will merge this PR after your CI passes on it
- @dependabot squash and merge will squash and merge this PR after your CI passes on it
- @dependabot cancel merge will cancel a previously requested merge and block automerging
- @dependabot reopen will reopen this PR if it is closed
- @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
- @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)