Per-Middleware Performance Benchmarks #374

@A6dulmalik

Description

Important: All implementation for this issue must be done exclusively in the middleware repository. No modifications should be made to the backend codebase.

Related: #369

Measure the latency overhead each middleware adds individually over the baseline. This gives contributors and users a clear cost reference before stacking middleware in production.


🛠 Implementation Hints

  • Reuse the benchmark server from #21 (Implement UpdateUserService and Integrate into UsersService); mount each middleware independently per test run
  • Run each benchmark for at least 10 seconds with a warmup period of 2 seconds to avoid cold-start noise
  • Calculate overhead as (middleware_p99 - baseline_p99) and surface it clearly in output
  • Automate across all middleware using a loop over exported middleware from src/index.ts
  • Write results into docs/PERFORMANCE.md in a markdown table
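The loop described in the hints could be sketched roughly as below. This is a minimal in-process sketch, not the real harness: the `middlewares` map, the Express-style `(req, res, next)` signature, and the iteration counts are all assumptions standing in for the actual exports of src/index.ts and the #21 benchmark server.

```typescript
// Minimal sketch: per-middleware p99 latency overhead vs. an empty baseline.
import { performance } from "node:perf_hooks";

type Middleware = (req: object, res: object, next: () => void) => void;

// p99 via the nearest-rank method on a sorted copy of the samples.
function p99(samples: number[]): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil(0.99 * sorted.length) - 1;
  return sorted[Math.max(0, rank)];
}

// Time `iterations` passes through a (possibly empty) chain; return p99 in ms.
function benchP99(mw: Middleware | null, iterations = 10_000): number {
  const samples: number[] = [];
  for (let i = 0; i < iterations; i++) {
    const start = performance.now();
    if (mw) mw({}, {}, () => {});
    samples.push(performance.now() - start);
  }
  return p99(samples);
}

// Hypothetical stand-in for the middleware exported from src/index.ts.
const middlewares: Record<string, Middleware> = {
  noop: (_req, _res, next) => next(),
};

const baseline = benchP99(null); // empty chain = baseline
const rows = Object.entries(middlewares).map(([name, mw]) => {
  const p = benchP99(mw);
  // Overhead delta as described in the hints: middleware_p99 - baseline_p99.
  return `| ${name} | ${p.toFixed(3)} | ${(p - baseline).toFixed(3)} |`;
});

console.log(
  ["| Middleware | p99 (ms) | Overhead (ms) |", "| --- | --- | --- |", ...rows].join("\n"),
);
```

A real run would drive HTTP traffic against the benchmark server (with the 2 s warmup and 10 s duration above) rather than calling handlers in-process, but the percentile and delta arithmetic is the same.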

Deliverables

  • benchmarks/middleware/ — one benchmark file per middleware
  • Overhead delta reported per middleware
  • Results table in docs/PERFORMANCE.md
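One possible layout for the docs/PERFORMANCE.md table (column names are a suggestion; the angle-bracket entries are placeholders, not measured values):

```markdown
| Middleware | Baseline p99 (ms) | Middleware p99 (ms) | Overhead (ms) |
| ---------- | ----------------- | ------------------- | ------------- |
| <name>     | <baseline_p99>    | <middleware_p99>    | <delta>       |
```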

Acceptance Criteria

Labels

Stellar Wave — Issues in the Stellar wave program
