JSON RPC 2.0 Batch support + fix broken Streamable HTTP transport #56

Open
RomanEmreis wants to merge 8 commits into main from feature/json-msg-batching

Conversation

@RomanEmreis
Owner

Summary

Implements JSON-RPC 2.0 batch request/response support (spec §6) on both the server and client sides of
Neva, plus a complete examples/batch/ workspace demonstrating the feature.

What Changed

Core Types (neva/src/types.rs)

  • Added MessageBatch — a non-empty wrapper around Vec<MessageEnvelope> enforcing the spec's "batch must contain at least one object" rule at
    construction and deserialization
  • Added Message::Batch(MessageBatch) variant with #[serde(untagged)] ordering: Batch is last so single-message deserialization is tried first
  • MessageBatch implements IntoIterator for ergonomic iteration over envelopes
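The non-empty invariant can be sketched roughly like this; `MessageEnvelope` is stubbed as a `String` here for illustration, and the real type (plus its serde integration) lives in neva/src/types.rs:

```rust
// Hypothetical sketch of the "batch must contain at least one object" rule;
// `MessageEnvelope` is a stand-in for the real envelope type.
type MessageEnvelope = String;

#[derive(Debug)]
struct MessageBatch(Vec<MessageEnvelope>);

impl MessageBatch {
    /// Enforce the JSON-RPC 2.0 rule that a batch is never empty.
    fn new(envelopes: Vec<MessageEnvelope>) -> Result<Self, &'static str> {
        if envelopes.is_empty() {
            Err("batch must contain at least one message")
        } else {
            Ok(Self(envelopes))
        }
    }
}

// Ergonomic iteration over the contained envelopes, as described above.
impl IntoIterator for MessageBatch {
    type Item = MessageEnvelope;
    type IntoIter = std::vec::IntoIter<MessageEnvelope>;
    fn into_iter(self) -> Self::IntoIter {
        self.0.into_iter()
    }
}

fn main() {
    assert!(MessageBatch::new(vec![]).is_err());
    let batch = MessageBatch::new(vec!["ping".into(), "pong".into()]).unwrap();
    assert_eq!(batch.into_iter().count(), 2);
}
```

With `#[serde(untagged)]`, keeping the `Batch` variant last means serde tries the single-message variants first, so an ordinary message never accidentally parses as a one-element batch.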

Server (neva/src/app.rs)

  • Handles incoming Message::Batch by processing each request concurrently and collecting responses into a single batch reply
  • Maintains JSON-RPC ordering: responses are returned in the same order as requests
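The ordering guarantee amounts to collecting results by spawn index rather than completion order. A minimal std-only sketch (threads standing in for the server's concurrent tasks, `req * 2` standing in for request handling):

```rust
use std::thread;

// Hypothetical sketch: handle each request concurrently, but join the
// handles in spawn order so responses come back in request order,
// regardless of which task finishes first.
fn process_batch(requests: Vec<i32>) -> Vec<i32> {
    let handles: Vec<_> = requests
        .into_iter()
        .map(|req| thread::spawn(move || req * 2)) // stand-in for handling one request
        .collect();
    handles.into_iter().map(|h| h.join().unwrap()).collect()
}

fn main() {
    assert_eq!(process_batch(vec![1, 2, 3]), vec![2, 4, 6]);
}
```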

Client Handler (neva/src/client/handler.rs)

  • Receive loop now handles Message::Batch: iterates envelopes and dispatches each Response to its waiting channel via pending.complete(resp);
    non-response envelopes in a batch (protocol violations per spec) are silently ignored for robustness
  • Added send_batch — registers pending slots for all requests, sends the batch atomically, and cleans up slots on send failure to prevent leaks
  • Added timeout() / pending() accessors needed by Client::call_batch
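The leak-prevention shape of `send_batch` can be illustrated with a stubbed pending-slot map (the real handler uses channels, not strings; `Pending` and the boolean "send succeeded" flag below are illustrative stand-ins):

```rust
use std::collections::HashMap;

// Hypothetical sketch of cleanup-on-failure: pending response slots are
// registered before sending, then rolled back if the send fails, so a
// failed batch cannot leave orphaned waiters behind.
struct Pending {
    slots: HashMap<u64, &'static str>, // request id -> waiting slot (stubbed)
}

impl Pending {
    fn send_batch(&mut self, ids: &[u64], send_ok: bool) -> Result<(), &'static str> {
        for &id in ids {
            self.slots.insert(id, "waiting");
        }
        if send_ok {
            Ok(())
        } else {
            // Transport failed: remove every slot we just registered.
            for id in ids {
                self.slots.remove(id);
            }
            Err("send failed")
        }
    }
}

fn main() {
    let mut pending = Pending { slots: HashMap::new() };
    assert!(pending.send_batch(&[1, 2], false).is_err());
    assert!(pending.slots.is_empty()); // no leaked slots after a failed send
    assert!(pending.send_batch(&[3], true).is_ok());
    assert_eq!(pending.slots.len(), 1);
}
```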

Client (neva/src/client.rs + new neva/src/client/batch.rs)

  • Client::call_batch(items: Vec<MessageEnvelope>) -> Result<Vec<Response>> — sends a batch and awaits all responses concurrently using try_join_all
    (short-circuits on first error)
  • Client::batch() -> BatchBuilder<'_> — entry point for the fluent builder API
  • BatchBuilder<'a> with 9 builder methods:
  Method                      Description
  list_tools()                Add a tools/list request
  call_tool(name, args)       Add a tools/call request (sets meta for progress tracking)
  list_resources()            Add a resources/list request
  read_resource(uri)          Add a resources/read request
  list_resource_templates()   Add a resources/templates/list request
  list_prompts()              Add a prompts/list request
  get_prompt(name, args)      Add a prompts/get request (sets meta for progress tracking)
  ping()                      Add a ping request
  notify(method, params)      Add a notification (fire-and-forget, excluded from responses)
  send()                      Flush the batch and await all responses
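The fail-fast behavior of `call_batch` mirrors how collecting `Result`s into a `Result<Vec<_>, _>` short-circuits at the first error. A std-only analogy (the real implementation awaits futures concurrently via `try_join_all`; this sketch only demonstrates the short-circuit semantics):

```rust
// Hypothetical sketch: collecting an iterator of Results into
// Result<Vec<_>, _> stops at the first Err, analogous to how
// `try_join_all` fails fast on the first failed response.
fn collect_responses(
    results: Vec<Result<&'static str, &'static str>>,
) -> Result<Vec<&'static str>, &'static str> {
    results.into_iter().collect()
}

fn main() {
    assert_eq!(collect_responses(vec![Ok("a"), Ok("b")]), Ok(vec!["a", "b"]));
    assert_eq!(
        collect_responses(vec![Ok("a"), Err("timeout"), Ok("b")]),
        Err("timeout")
    );
}
```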

Example (examples/batch/)

A self-contained workspace (client + server) demonstrating batch in action:

  • Server: exposes an add tool, a greeting prompt, and three notes resources (daily, weekly, monthly)
  • Client: sends one batch covering all capability types — list_tools, list_resources, list_prompts, call_tool, read_resource × 2,
    get_prompt, ping — then parses and logs each response
let responses = client
    .batch()
    .list_tools()
    .list_resources()
    .list_prompts()
    .call_tool("add", [("a", 40_i32), ("b", 2_i32)])
    .read_resource("notes://daily")
    .read_resource("notes://weekly")
    .get_prompt("greeting", [("name", "Neva")])
    .ping()
    .send()
    .await?;

Other

  • Fixed broken Streamable HTTP server flow (response flushing regression)
  • Graceful error on MCP protocol version mismatch (was a panic)
  • OSS community files: CONTRIBUTING.md, CODE_OF_CONDUCT.md, SECURITY.md, issue/PR templates, cleaned-up .gitignore
  • Dependency updates (neva/Cargo.toml, neva_macros/Cargo.toml)

Test Coverage

  • handler.rs: batch_responses_are_distributed_individually — verifies mixed-envelope batch (Response + Request + Response) dispatches only the two
    responses to their pending channels
  • client.rs: call_batch_returns_error_when_disconnected — verifies early error when no handler is attached
  • All existing tests pass (54 total, 0 failures, clippy clean)

Running the Example

# Terminal 1 — server
cargo run -p server --manifest-path examples/batch/Cargo.toml

# Terminal 2 — client
cargo run -p client --manifest-path examples/batch/Cargo.toml

@RomanEmreis RomanEmreis self-assigned this Mar 8, 2026
@RomanEmreis RomanEmreis added the labels bug, enhancement, performance, and feature Mar 8, 2026

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 4ec251d795

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

@RomanEmreis
Owner Author

@codex review


@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 33756e8b8e


@RomanEmreis
Owner Author

@codex review


@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: b7698cb817


@RomanEmreis
Owner Author

@codex review

@chatgpt-codex-connector

You have reached your Codex usage limits for code reviews. You can see your limits in the Codex usage dashboard.
To continue using code reviews, you can upgrade your account or add credits to your account and enable them for code reviews in your settings.

@RomanEmreis RomanEmreis changed the title from "JSON RPC Batch" to "JSON RPC 2.0 Batch support + fix broken Streamable HTTP transport" Mar 9, 2026