# NBatch

A lightweight batch processing framework for .NET, inspired by Spring Batch.

📖 Full Documentation — guides, API reference, and examples.
NBatch gives you a declarative, step-based pipeline for ETL jobs, data migrations, and scheduled tasks. Wire up readers, processors, and writers — NBatch handles chunking, error skipping, progress tracking, and restart-from-failure.
## Packages

| Package | Description |
|---|---|
| `NBatch` | Core framework — interfaces, chunking, skip policies, `CsvReader`, `DbReader`/`DbWriter`, DI, hosted service |
| `NBatch.EntityFrameworkCore` | EF Core job store for restart-from-failure (SQL Server, PostgreSQL, SQLite, MySQL/MariaDB) |
## Installation

```shell
dotnet add package NBatch
dotnet add package NBatch.EntityFrameworkCore   # only if you need persistent job tracking
```

## Quick start

```csharp
var job = Job.CreateBuilder("ETL")
    .AddStep("extract-transform", step => step
        .ReadFrom(new CsvReader<Order>(...))
        .WriteTo(new DbWriter<Order>(...))
        .WithChunkSize(100))
    .AddStep("notify", step => step
        .Execute(() => SendEmail()))
    .Build();
```

With a persistent job store and a skip policy:

```csharp
var job = Job.CreateBuilder("csv-import")
    .UseJobStore(connStr, DatabaseProvider.SqlServer) // optional — enables restart-from-failure
    .AddStep("import", step => step
        .ReadFrom(new CsvReader<Product>("data.csv", mapFn))
        .ProcessWith(p => new Product { Name = p.Name.ToUpper(), Price = p.Price })
        .WriteTo(new DbWriter<Product>(dbContext))
        .WithSkipPolicy(SkipPolicy.For<FormatException>(maxSkips: 5))
        .WithChunkSize(100))
    .AddStep("notify", step => step
        .Execute(() => SendEmail()))
    .Build();

await job.RunAsync();
```

Registered with dependency injection as a hosted service:

```csharp
builder.Services.AddNBatch(nbatch =>
{
    nbatch.AddJob("csv-import", (sp, job) => job
        .AddStep("import", step => step
            .ReadFrom(new CsvReader<Product>("data.csv", mapFn))
            .WriteTo(new DbWriter<Product>(sp.GetRequiredService<AppDbContext>()))
            .WithChunkSize(100)))
        .RunEvery(TimeSpan.FromHours(1));
});
```

## Features

- Chunk-oriented processing — read, transform, and write in configurable batches
- Skip policies — skip malformed records instead of aborting the job
- Restart from failure — SQL-backed job store resumes where a crashed job left off
- Tasklet steps — fire-and-forget work (send an email, call an API, run a stored proc)
- Lambda-friendly — processors and writers can be plain lambdas; no extra classes needed
- DI & hosted service — `AddNBatch()`, `RunOnce()`, `RunEvery()` for background jobs
- Multi-target — .NET 8, .NET 9, and .NET 10
- Provider-agnostic — SQL Server, PostgreSQL, SQLite, or MySQL for the job store; any EF Core provider for your data
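The quick-start examples pass a `mapFn` into `CsvReader<Product>` without showing it. Below is a minimal sketch of such a mapping function, assuming the reader hands each row to the delegate as a `string[]` of raw field values in column order (the actual delegate signature is in the API reference; the `Product` type here is illustrative):

```csharp
using System;
using System.Globalization;

// Hypothetical row-mapping delegate for CsvReader<Product> ("mapFn" in the
// quick start). Assumes each CSV row arrives as a string[] of raw fields
// in column order: Name, Price.
Func<string[], Product> mapFn = fields => new Product
{
    Name = fields[0].Trim(),
    Price = decimal.Parse(fields[1], CultureInfo.InvariantCulture)
};

public class Product
{
    public string Name { get; set; } = "";
    public decimal Price { get; set; }
}
```

Parsing with `CultureInfo.InvariantCulture` keeps the import stable regardless of the machine's locale.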
## Documentation

See the full documentation for guides, API reference, and examples:
- Core concepts — jobs, steps, readers, writers, processors
- Readers & writers — `CsvReader`, `DbReader`, `DbWriter`, `FlatFileItemWriter`
- Skip policies — error handling and skip limits
- Job store — persistent tracking and restart-from-failure
- DI & hosted service — `AddNBatch()`, `IJobRunner`, `RunOnce()`, `RunEvery()`
- Listeners — job and step lifecycle hooks
- API reference — all public types and methods
- Examples — CSV-to-DB, DB-to-file, multi-step, DI, hosted service
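The DB-to-file scenario listed above pairs `DbReader` with `FlatFileItemWriter`. The sketch below reuses only the builder calls shown in the quick start; the `DbReader` and `FlatFileItemWriter` constructor arguments are assumptions for illustration (a DbContext for the reader, a path plus line-formatting delegate for the writer) — check the API reference for the real signatures:

```csharp
// Sketch only — reader/writer constructor shapes are assumed, not
// confirmed by this README.
var export = Job.CreateBuilder("db-to-file")
    .AddStep("export", step => step
        .ReadFrom(new DbReader<Order>(dbContext))           // assumed: reads Order rows via EF Core
        .WriteTo(new FlatFileItemWriter<Order>("orders.csv",
            o => $"{o.Id},{o.Total}"))                      // assumed: path + line formatter
        .WithChunkSize(100))
    .Build();

await export.RunAsync();
```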
## Development

```shell
# Start the test database (SQL Server via Docker)
cd src && docker compose up -d

# Build & run the demo console app
dotnet build
dotnet run --project NBatch.ConsoleApp

# Run tests
dotnet test
```

## Contributing

- Fork the repo
- Create a feature branch: `git checkout -b my-feature`
- Commit your changes: `git commit -m "Add my feature"`
- Push: `git push origin my-feature`
- Open a pull request
## License

See LICENSE for details.