
Update batching+concurrency docs #1516

Merged
lkasinathan merged 3 commits into main from lakshmi/batching-with-concurrency-keys
Apr 7, 2026

Conversation

@lkasinathan
Contributor

Right now, the docs say the following, all of which is categorically untrue:

However, the concurrency key option is ignored when batching is enabled. This means:

  • If you configure concurrency: { limit: 1, key: "event.data.user_id" } with batching, the key expression will have no effect
  • Batches will be processed one at a time globally (based on the limit), not per unique key value
  • The concurrency limit applies to all batches regardless of what key values the events contain
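Since the quoted claims are being corrected, the implied behavior is the opposite: the concurrency key does take effect per unique key value even when events are batched. As a minimal, self-contained sketch of that per-key grouping (not the SDK's actual implementation; `batchByKey`, the `Event` shape, and the batch size of 2 are all hypothetical), events sharing a key value form their own batches, so a `limit` applies per user rather than globally:

```typescript
// Hypothetical event shape mirroring the docs' "event.data.user_id" key.
type Event = { name: string; data: { user_id: string } };

// Group events into batches per concurrency-key value. With a setting like
// concurrency: { limit: 1, key: "event.data.user_id" }, the limit would then
// bound concurrent batches per user_id, not across all batches globally.
function batchByKey(
  events: Event[],
  keyOf: (e: Event) => string,
): Map<string, Event[][]> {
  const maxBatchSize = 2; // hypothetical batch size for illustration
  const byKey = new Map<string, Event[][]>();
  for (const e of events) {
    const k = keyOf(e);
    let batches = byKey.get(k);
    if (!batches) {
      batches = [[]];
      byKey.set(k, batches);
    }
    let last = batches[batches.length - 1];
    if (last.length >= maxBatchSize) {
      last = [];
      batches.push(last);
    }
    last.push(e);
  }
  return byKey;
}

const events: Event[] = [
  { name: "signup", data: { user_id: "a" } },
  { name: "signup", data: { user_id: "b" } },
  { name: "signup", data: { user_id: "a" } },
  { name: "signup", data: { user_id: "a" } },
];

const grouped = batchByKey(events, (e) => e.data.user_id);
// User "a" has 3 events -> two batches ([2 events], [1 event]);
// user "b" has 1 event -> one batch.
console.log(grouped.get("a")!.length, grouped.get("b")!.length); // logs: 2 1
```

With this grouping, a `limit: 1` would serialize batches only within each user's queue, while batches for different `user_id` values can run concurrently.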

@vercel

vercel bot commented Apr 6, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project   Deployment   Updated (UTC)
website   Ready        Apr 7, 2026 0:29am


@lkasinathan lkasinathan merged commit 7a790c9 into main Apr 7, 2026
7 of 8 checks passed
@lkasinathan lkasinathan deleted the lakshmi/batching-with-concurrency-keys branch April 7, 2026 02:59
