Configure grpc max dump history #850

Open

nicolasLuduena wants to merge 2 commits into txpipe:main from nicolasLuduena:max-dump-history-configurable

Conversation

nicolasLuduena (Member) commented Jan 28, 2026

This pull request introduces a configurable limit on the maximum number of history items that can be dumped via the gRPC sync service, replacing the previously hardcoded value. The limit is now set through the GrpcConfig struct, and the relevant logic and tests have been updated accordingly.

Configuration Improvements:

  • Added an optional max_dump_history_items field to the GrpcConfig struct, allowing the maximum number of dump history items to be set via configuration.
  • Updated the gRPC server startup logic in src/serve/grpc/mod.rs to read the new configuration value, defaulting to 100 if not set, and to pass it to the sync service implementation (see the sketch after this list).
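
A minimal sketch of the shape of this change, assuming the config struct is deserialized with serde as is typical for dolos configuration; the GrpcConfig name, the max_dump_history_items: Option<u32> field, and the default of 100 come from this PR, while the constant and helper names are hypothetical:

```rust
use serde::Deserialize;

// Sketch of the new optional field on GrpcConfig; the serde derive and
// the elided fields are assumptions.
#[derive(Debug, Deserialize)]
pub struct GrpcConfig {
    // ... existing fields elided ...

    /// Maximum number of history items a single dump request may return.
    /// `None` means "fall back to the default".
    pub max_dump_history_items: Option<u32>,
}

/// Default applied at server startup when the field is not set.
const DEFAULT_MAX_DUMP_HISTORY_ITEMS: u32 = 100;

// Hypothetical startup wiring: resolve the effective limit once and hand
// it to the sync service constructor.
fn effective_dump_limit(config: &GrpcConfig) -> u32 {
    config
        .max_dump_history_items
        .unwrap_or(DEFAULT_MAX_DUMP_HISTORY_ITEMS)
}
```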

Sync Service Logic Updates:

  • Removed the hardcoded MAX_DUMP_HISTORY_ITEMS constant in src/serve/grpc/sync.rs and instead made the limit a field of the SyncServiceImpl struct, set via its constructor.
  • Updated the validation logic in the sync service to use the configurable max_history_items value when checking requests, and improved the error message to reflect the dynamic limit (sketched below).
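
A sketch of the service side under the same caveats: the SyncServiceImpl name, the constructor-injected limit, and the max_history_items field come from the description above, while tonic::Status and the validate_dump_request helper are assumptions:

```rust
use tonic::Status;

// Sketch: the limit lives on the service instance instead of a
// module-level constant.
pub struct SyncServiceImpl {
    max_history_items: u32,
    // ... other fields (stores, domain handle, etc.) elided ...
}

impl SyncServiceImpl {
    // The constructor now receives the configured limit.
    pub fn new(max_history_items: u32) -> Self {
        Self { max_history_items }
    }

    // Hypothetical validation helper: reject requests that ask for more
    // items than the configured limit, quoting the dynamic value in the
    // error message instead of a hardcoded number.
    fn validate_dump_request(&self, requested: u32) -> Result<(), Status> {
        if requested > self.max_history_items {
            return Err(Status::invalid_argument(format!(
                "requested {} history items, but the configured maximum is {}",
                requested, self.max_history_items
            )));
        }
        Ok(())
    }
}
```

Holding the limit as an instance field rather than a global constant also makes the service trivially testable with different limits, which is what the test changes below rely on.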

Test Adjustments:

  • Updated tests in src/serve/grpc/sync.rs to use the new constructor for SyncServiceImpl that accepts the configurable limit, and to validate the new behavior (see the hypothetical test below).
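
A hypothetical test in that spirit, built on the sketches above rather than on the actual test suite:

```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn rejects_requests_over_the_configured_limit() {
        // Construct the service with a small, explicit limit.
        let service = SyncServiceImpl::new(10);

        // At or below the limit: accepted.
        assert!(service.validate_dump_request(10).is_ok());

        // Above the limit: rejected with InvalidArgument.
        let err = service.validate_dump_request(11).unwrap_err();
        assert_eq!(err.code(), tonic::Code::InvalidArgument);
    }
}
```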

Summary by CodeRabbit

New Features

  • Added a configuration option for the gRPC sync service's maximum dump history limit, letting users customize the per-request cap, with a default of 100 items (see the example below).
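
From a user's point of view, the new knob is an optional key that falls back to 100 when omitted. A small round-trip sketch; the toml crate is an assumption, and the flat key layout is a simplification (in the real config the key would sit inside the gRPC section):

```rust
use serde::Deserialize;

#[derive(Deserialize)]
struct GrpcConfig {
    max_dump_history_items: Option<u32>,
}

fn main() {
    // Key present: the user's override is picked up.
    let cfg: GrpcConfig = toml::from_str("max_dump_history_items = 250").unwrap();
    assert_eq!(cfg.max_dump_history_items, Some(250));

    // Key absent: startup falls back to the default of 100.
    let cfg: GrpcConfig = toml::from_str("").unwrap();
    assert_eq!(cfg.max_dump_history_items.unwrap_or(100), 100);
}
```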


coderabbitai bot commented Jan 28, 2026

📝 Walkthrough

A new optional configuration field enables per-instance customization of the gRPC dump history limit, replacing the previous hard-coded constant. The value defaults to 100 when not specified and propagates from initialization into the SyncServiceImpl struct.

Changes

Cohort: gRPC Dump History Configurability
File(s): crates/core/src/config.rs, src/bin/dolos/init.rs, src/serve/grpc/mod.rs, src/serve/grpc/sync.rs
Summary: Added a max_dump_history_items: Option<u32> field to GrpcConfig; threaded the configuration value through initialization (defaulting to 100); stored it as an instance field on SyncServiceImpl and replaced the hard-coded MAX_DUMP_HISTORY_ITEMS constant with the configurable instance value in the validation logic.

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~12 minutes

Poem

🐰 A limit once carved in stone,
Now hops freely, fully own'd—
Config fields bloom, green and bright,
Threading values left and right! 🌱

🚥 Pre-merge checks | ✅ 2 passed | ❌ 1 failed (warning)

❌ Failed checks (1 warning)
  • Docstring Coverage: ⚠️ Warning. Docstring coverage is 28.57%, which is below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (2 passed)
  • Description Check: ✅ Passed. Check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check: ✅ Passed. The pull request title "Configure grpc max dump history" clearly and directly summarizes the main change: making the gRPC maximum dump history configurable instead of hardcoded.

