Create performance benchmark suite with CI integration #10

@eraschle

Description

🚀 Feature Description

Implement a comprehensive performance benchmark suite to track PyM Core performance over time and detect regressions.

🎯 Benchmark Categories

1. Parameter Operations

  • Parameter access speed (with/without caching; see the sketch after this list)
  • Unit conversion performance
  • Parameter validation time
  • Bulk parameter operations
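
A minimal sketch of the cached vs. uncached comparison, assuming the `Pole` API from the implementation plan below and a hypothetical `clear_parameter_cache` helper:

```python
# Hypothetical sketch: cached vs. uncached parameter access.
# `clear_parameter_cache` is an assumed helper, not a confirmed PyM Core API.
import pytest

from pymapp import Pole


@pytest.fixture
def pole():
    p = Pole("bench_pole")
    p.set_height(5000.0)
    return p


def test_parameter_access_cached(benchmark, pole):
    pole.height()  # warm the cache once before measuring
    result = benchmark(pole.height)
    assert result == 5000.0


def test_parameter_access_uncached(benchmark, pole):
    def access():
        pole.clear_parameter_cache()  # assumed cache-invalidation hook
        return pole.height()

    benchmark(access)
```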

2. Element Operations

  • Element creation and initialization
  • Container extension performance
  • Serialization/deserialization speed (see the sketch after this list)
  • Memory usage patterns
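
A possible shape for the serialization benchmarks; `to_dict`/`from_dict` are assumed names for the (de)serialization entry points:

```python
# Hypothetical sketch: element serialization/deserialization speed.
from pymapp import Pole


def make_pole() -> Pole:
    pole = Pole("bench_pole")
    pole.set_height(5000.0)
    return pole


def test_element_serialization(benchmark):
    # Measure dict export of a configured element
    benchmark(make_pole().to_dict)


def test_element_deserialization(benchmark):
    # Round-trip: rebuild an element from its serialized form
    data = make_pole().to_dict()
    benchmark(Pole.from_dict, data)
```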

3. Repository Operations

  • Save/load operations (single vs batch)
  • Query performance with different dataset sizes (see the sketch after this list)
  • JSON serialization performance
  • Memory usage with large repositories
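
One way to cover the dataset-size axis is a parametrized benchmark; `find_by_name` is an assumed query method:

```python
# Hypothetical sketch: query performance across dataset sizes.
import pytest

from pymcore import ElementRepository
from pymapp import Pole


@pytest.mark.parametrize("size", [100, 1_000, 10_000])
def test_repository_query_scaling(benchmark, size):
    repo = ElementRepository()
    repo.save_batch([Pole(f"pole_{i}") for i in range(size)])
    # Worst case: look up the last element inserted
    benchmark(repo.find_by_name, f"pole_{size - 1}")
```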

4. Integration Scenarios

  • Complete railway scene creation
  • Large infrastructure project simulation
  • Multi-threaded access patterns
  • Memory leak detection (see the sketch after this list)
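
The leak check can be built on the standard library alone; in the sketch below, the create/discard loop and the 1 MB budget are illustrative assumptions:

```python
# Leak detection sketch using tracemalloc and gc from the standard library.
import gc
import tracemalloc

from pymapp import Pole


def test_no_leak_on_repeated_creation():
    tracemalloc.start()
    baseline, _ = tracemalloc.get_traced_memory()

    for i in range(1_000):
        pole = Pole(f"pole_{i}")
        pole.set_height(5000.0)
        del pole  # elements should be reclaimable once dropped

    gc.collect()
    current, _ = tracemalloc.get_traced_memory()
    tracemalloc.stop()

    # Some noise is expected; growth beyond ~1 MB suggests a leak (assumed budget)
    assert current - baseline < 1_000_000
```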

🔧 Implementation Plan

Benchmark Framework

```python
# Use pytest-benchmark for consistent measurements
import pytest

from pymcore import GenericElement, ElementRepository
from pymapp import Pole


class TestPerformanceBenchmarks:
    def test_parameter_access_performance(self, benchmark):
        pole = Pole("test_pole")
        pole.set_height(5000.0)

        # Benchmark parameter access
        result = benchmark(pole.height)
        assert result == 5000.0

    def test_repository_save_performance(self, benchmark):
        repo = ElementRepository()
        elements = [Pole(f"pole_{i}") for i in range(1000)]

        # Benchmark batch save
        benchmark(repo.save_batch, elements)
```
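
Locally, the suite can be run with `uv run pytest benchmarks/ --benchmark-only`; adding `--benchmark-json=benchmark.json` exports results in the format the CI step below consumes.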

CI Integration

```yaml
# .github/workflows/benchmarks.yml
name: Performance Benchmarks
on: [push, pull_request]

jobs:
  benchmark:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Setup uv
        uses: astral-sh/setup-uv@v1
      - name: Run benchmarks
        run: uv run pytest benchmarks/ --benchmark-json=benchmark.json
      - name: Store benchmark results
        uses: benchmark-action/github-action-benchmark@v1
        with:
          tool: 'pytest'
          output-file-path: benchmark.json
```

📊 Performance Targets

Baseline Metrics (to establish)

  • Parameter access: < 1ms per operation
  • Element creation: < 10ms per element
  • Repository save: < 100ms per 1000 elements
  • Memory usage: < 1MB per 1000 simple elements

Regression Detection

  • Performance degradation >20% fails CI (see the workflow sketch after this list)
  • Memory usage increase >30% triggers warning
  • Trend analysis over multiple commits
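
The degradation rule could be wired into the workflow above via github-action-benchmark's alerting options; a sketch of the extended "Store benchmark results" step, where `alert-threshold: '120%'` expresses the >20% rule:

```yaml
- name: Store benchmark results
  uses: benchmark-action/github-action-benchmark@v1
  with:
    tool: 'pytest'
    output-file-path: benchmark.json
    alert-threshold: '120%'   # >20% slower than baseline raises an alert
    fail-on-alert: true       # fail CI on regression
    comment-on-alert: true    # post the comparison on the offending commit
    github-token: ${{ secrets.GITHUB_TOKEN }}
```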

✅ Acceptance Criteria

  • Comprehensive benchmark suite covers all major operations
  • CI integration runs benchmarks on every PR
  • Performance regression detection working
  • Baseline metrics established and documented
  • Memory usage benchmarks included
  • Results tracked over time with visualizations
  • Documentation explains how to run and interpret benchmarks

📂 Directory Structure

```
benchmarks/
├── conftest.py               # Benchmark fixtures
├── test_parameter_perf.py    # Parameter operation benchmarks
├── test_element_perf.py      # Element creation/manipulation
├── test_repository_perf.py   # Repository operations
├── test_integration_perf.py  # End-to-end scenarios
└── utils/
    ├── data_generators.py    # Test data creation
    └── memory_profiler.py    # Memory usage tracking
```
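
As a starting point, `conftest.py` could host shared fixtures built on the assumed `Pole`/`ElementRepository` API from the examples above:

```python
# benchmarks/conftest.py (sketch): shared fixtures for the benchmark modules.
import pytest

from pymcore import ElementRepository
from pymapp import Pole


def make_poles(count: int) -> list[Pole]:
    """Generate simple test elements (candidate for utils/data_generators.py)."""
    return [Pole(f"pole_{i}") for i in range(count)]


@pytest.fixture
def populated_repo() -> ElementRepository:
    """Repository preloaded with 1000 elements for query/save benchmarks."""
    repo = ElementRepository()
    repo.save_batch(make_poles(1000))
    return repo
```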

🎯 Success Metrics

  • Detect performance regressions before they reach main branch
  • Track performance improvements from optimizations
  • Provide data-driven insights for optimization priorities
  • Enable confident performance-related code changes

⏱️ Estimated Effort

4-5 days - Framework setup, benchmark implementation, CI integration, documentation

Metadata

Labels

  • component/tests 🧪 Test suite and testing infrastructure
  • priority/medium 🟡 Nice to have for milestone
  • type/testing 🧪 Test improvements or additions
