User Story
As a developer optimizing Pulse performance, I want a NetEvolve.Pulse.Benchmarks project with BenchmarkDotNet, so that I can measure dispatch latency, interceptor overhead, and outbox throughput with reproducible, CI-runnable benchmarks.
Requirements
- Create benchmarks/NetEvolve.Pulse.Benchmarks/NetEvolve.Pulse.Benchmarks.csproj:
- Target framework: net9.0.
- Dependencies: BenchmarkDotNet, NetEvolve.Pulse (project reference), NetEvolve.Pulse.EntityFramework (project reference).
- Program.cs: BenchmarkSwitcher.FromAssembly(typeof(Program).Assembly).Run(args).
- Add to solution.
- Benchmark classes:
- CommandDispatchBenchmarks: SendAsync with 0/1/3/5 interceptors.
- QueryDispatchBenchmarks: QueryAsync cached vs. uncached.
- EventDispatchBenchmarks: sequential 1/5 handlers, parallel 5 handlers, with EventFilterInterceptor.
- StreamQueryBenchmarks: 10/100 items with/without logging interceptor; measures time-to-first-item and throughput.
- InterceptorOverheadBenchmarks: [Params(0,1,3,5,10)] interceptors, no-op handler.
- OutboxThroughputBenchmarks: EF Core InMemory + SQLite in-memory, batch sizes 1/10/100.
- RegistrationBenchmarks: manual vs. source-gen registration startup time, 10/50/100 handlers.
- Add .github/workflows/benchmarks.yml: trigger on a benchmark label or workflow_dispatch; run dotnet run -c Release; post Markdown results as a PR comment.
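The project file described above could look roughly like this. The relative ProjectReference paths and the BenchmarkDotNet package version are assumptions about the repository layout, not confirmed by this spec:

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <!-- Benchmarks run as a console app via BenchmarkSwitcher. -->
    <OutputType>Exe</OutputType>
    <TargetFramework>net9.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <!-- Version is illustrative; pin to whatever the repo standardizes on. -->
    <PackageReference Include="BenchmarkDotNet" Version="0.14.0" />
  </ItemGroup>

  <ItemGroup>
    <!-- Paths assume a conventional src/ layout; adjust to the actual tree. -->
    <ProjectReference Include="../../src/NetEvolve.Pulse/NetEvolve.Pulse.csproj" />
    <ProjectReference Include="../../src/NetEvolve.Pulse.EntityFramework/NetEvolve.Pulse.EntityFramework.csproj" />
  </ItemGroup>

</Project>
```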
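A minimal sketch of Program.cs plus one of the benchmark classes. The Program entry point follows the requirement verbatim; the IDispatcher, SendAsync, NoOpCommand, and TestDispatcherFactory names are hypothetical stand-ins for the actual Pulse API and test helpers:

```csharp
using System.Threading.Tasks;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

// Entry point as specified: BenchmarkSwitcher lets the CLI (and CI) pick
// benchmark classes via --filter.
public class Program
{
    public static void Main(string[] args) =>
        BenchmarkSwitcher.FromAssembly(typeof(Program).Assembly).Run(args);
}

// Sketch of InterceptorOverheadBenchmarks. IDispatcher/SendAsync/NoOpCommand
// are assumed names for the Pulse dispatch API, not confirmed by this spec.
[MemoryDiagnoser]
public class InterceptorOverheadBenchmarks
{
    [Params(0, 1, 3, 5, 10)]
    public int InterceptorCount { get; set; }

    private IDispatcher _dispatcher = null!;

    [GlobalSetup]
    public void Setup() =>
        // Hypothetical helper: builds a dispatcher with a no-op command
        // handler and InterceptorCount pass-through interceptors registered.
        _dispatcher = TestDispatcherFactory.Create(InterceptorCount);

    [Benchmark]
    public Task Dispatch() => _dispatcher.SendAsync(new NoOpCommand());
}
```

[MemoryDiagnoser] is worth keeping on all classes so allocation regressions show up alongside latency.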
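The workflow could be sketched as below, assuming BenchmarkDotNet's GitHub exporter (which writes *-report-github.md under BenchmarkDotNet.Artifacts/results/) and a third-party comment action; the label name, project path, and comment mechanism are all open to change:

```yaml
name: Benchmarks

on:
  workflow_dispatch:
  pull_request:
    types: [labeled]

jobs:
  benchmarks:
    # Run on manual dispatch, or when the 'benchmark' label is applied.
    if: github.event_name == 'workflow_dispatch' || github.event.label.name == 'benchmark'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-dotnet@v4
        with:
          dotnet-version: 9.0.x
      - name: Run benchmarks
        run: >
          dotnet run -c Release
          --project benchmarks/NetEvolve.Pulse.Benchmarks
          -- --filter '*' --exporters GitHub
      - name: Post results as PR comment
        if: github.event_name == 'pull_request'
        # Assumed comment action; swap for actions/github-script if preferred.
        uses: marocchino/sticky-pull-request-comment@v2
        with:
          path: BenchmarkDotNet.Artifacts/results/*-report-github.md
```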
Acceptance Criteria