Merged
37 changes: 36 additions & 1 deletion CHANGELOG.md
@@ -7,6 +7,40 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

## [Unreleased]

## [0.10.0] - 2026-03-01

### Added

- NumPy array support via `NumberArray` class
- Vectorized arithmetic with unit tracking and uncertainty propagation
- Vectorized conversion: `heights.to(units.foot)`
- Reduction operations: `sum()`, `mean()`, `std()`, `min()`, `max()`
- Comparison operators returning boolean arrays for filtering
- N-D array support with broadcasting
- Callable syntax: `units.meter([1, 2, 3])` returns `NumberArray`
- Pandas integration via `NumberSeries` and `UconSeriesAccessor`
- `df['height'].ucon.with_unit(units.meter).to(units.foot)`
- Arithmetic preserves unit semantics
- Polars integration via `NumberColumn`
- Wraps `pl.Series` with unit metadata
- `.to()` conversion with unit tracking
- Map array support for vectorized operations
- `LinearMap`, `AffineMap`, `LogMap`, `ExpMap` work with numpy arrays
- `_log()` and `_exp()` helpers for scalar/array compatibility
- Optional dependencies: `ucon[numpy]`, `ucon[pandas]`, `ucon[polars]`
- Performance caching for repeated operations
- Conversion path caching in `ConversionGraph`
- Scale factor caching for scale-only conversions
- Unit multiplication/division caching
- `fold_scale()` result caching on `UnitProduct`
- Performance benchmarks: `make benchmark`, `make benchmark-pint`
- Documentation: `docs/guides/numpy-arrays.md`, `docs/guides/pandas-integration.md`, `docs/guides/polars-integration.md`
- Example notebook: `examples/scientific_computing.ipynb`
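
The uncertainty-propagating reductions listed above follow the standard independent-uncertainty rules. A plain-NumPy sketch of that math (the arithmetic only — `NumberArray` internals are not shown here):

```python
import numpy as np

values = np.array([1.7, 1.8, 1.9, 2.0])      # measurements, in meters
sigmas = np.array([0.01, 0.01, 0.02, 0.02])  # per-element uncertainties

# Sum of independent uncertain values: uncertainties add in quadrature.
total = values.sum()
total_sigma = np.sqrt((sigmas ** 2).sum())

# Mean: divide both the value and the propagated uncertainty by n.
n = values.size
mean = total / n
mean_sigma = total_sigma / n
```

The same quadrature rule extends elementwise to the vectorized add/sub operations, which is why per-element and uniform uncertainties can share one code path.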

### Changed

- `Unit.__call__` and `UnitProduct.__call__` return `NumberArray` when given list or ndarray input

## [0.9.4] - 2026-02-28

### Changed
@@ -454,7 +488,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- Initial commit

<!-- Links -->
[Unreleased]: https://github.com/withtwoemms/ucon/compare/0.9.4...HEAD
[Unreleased]: https://github.com/withtwoemms/ucon/compare/0.10.0...HEAD
[0.10.0]: https://github.com/withtwoemms/ucon/compare/0.9.4...0.10.0
[0.9.4]: https://github.com/withtwoemms/ucon/compare/0.9.3...0.9.4
[0.9.3]: https://github.com/withtwoemms/ucon/compare/0.9.2...0.9.3
[0.9.2]: https://github.com/withtwoemms/ucon/compare/0.9.1...0.9.2
21 changes: 21 additions & 0 deletions Makefile
@@ -12,6 +12,8 @@ DEPS_INSTALLED := ${UV_VENV}/.deps-installed
TESTDIR := tests/
TESTNAME ?=
COVERAGE ?= true
BENCH_SIZES ?= 1000 10000 100000
BENCH_ITERATIONS ?= 50

# --- Color Setup ---
GREEN := \033[0;32m
@@ -35,12 +37,16 @@ help:
@echo " ${CYAN}clean${RESET} - Remove build artifacts and caches"
@echo " ${CYAN}stubs${RESET} - Generate dimension.pyi type stubs"
@echo " ${CYAN}stubs-check${RESET} - Verify stubs are current (for CI)"
@echo " ${CYAN}benchmark${RESET} - Run array performance benchmarks"
@echo " ${CYAN}benchmark-pint${RESET} - Run benchmarks with pint comparison"
@echo ""
@echo "${YELLOW}Variables:${RESET}\n"
@echo " PYTHON=${PYTHON} - Python version for test target"
@echo " UV_VENV=${UV_VENV} - Path to virtual environment"
@echo " TESTNAME= - Specific test to run (e.g., tests.ucon.test_core)"
@echo " COVERAGE=${COVERAGE} - Enable coverage (true/false)"
@echo " BENCH_SIZES=${BENCH_SIZES} - Array sizes for benchmarks"
@echo " BENCH_ITERATIONS=${BENCH_ITERATIONS} - Iterations per benchmark"
@echo ""

# --- uv Installation ---
@@ -163,3 +169,18 @@ stubs-check: ${DEPS_INSTALLED}
@echo "${GREEN}Verifying dimension stubs are current...${RESET}"
@UV_PROJECT_ENVIRONMENT=${UV_VENV} uv run --python ${PYTHON} \
python scripts/generate_dimension_stubs.py --check

# --- Benchmarks ---
.PHONY: benchmark
benchmark: ${DEPS_INSTALLED}
@echo "${GREEN}Running array performance benchmarks...${RESET}"
@UV_PROJECT_ENVIRONMENT=${UV_VENV} uv run --python ${PYTHON} \
python benchmarks/array_operations.py --sizes ${BENCH_SIZES} --iterations ${BENCH_ITERATIONS}

.PHONY: benchmark-pint
benchmark-pint: ${DEPS_INSTALLED}
@echo "${GREEN}Installing pint for comparison...${RESET}"
@uv pip install pint --python ${UV_VENV}/bin/python 2>/dev/null || true
@echo "${GREEN}Running benchmarks with pint comparison...${RESET}"
@UV_PROJECT_ENVIRONMENT=${UV_VENV} uv run --python ${PYTHON} \
python benchmarks/array_operations.py --sizes ${BENCH_SIZES} --iterations ${BENCH_ITERATIONS} --with-pint
25 changes: 23 additions & 2 deletions README.md
@@ -48,6 +48,9 @@ pip install ucon
With extras:

```bash
pip install ucon[numpy] # NumPy array support
pip install ucon[pandas] # Pandas DataFrame integration
pip install ucon[polars] # Polars DataFrame integration
pip install ucon[pydantic] # Pydantic v2 integration
pip install ucon-tools[mcp] # MCP server for AI agents (separate package)
```
@@ -92,6 +95,22 @@ speed(units.meter(100), units.second(10)) # <10.0 m/s>
speed(units.second(100), units.second(10)) # raises ValueError
```

### NumPy Arrays

```python
from ucon import units

# Vectorized operations on arrays
heights = units.meter([1.7, 1.8, 1.9, 2.0])
heights_ft = heights.to(units.foot) # <[5.577, 5.906, 6.234, 6.562] ft>

# Arithmetic with unit tracking
areas = heights * units.meter([2, 2, 2, 2]) # m^2

# Statistical reductions preserve units
avg = heights.mean() # <1.85 m>
```
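
The vectorized `.to()` above is, at bottom, a single scale multiply over the whole array. A plain-NumPy sketch of that arithmetic (3.28084 ft/m is the standard conversion factor, not read from ucon):

```python
import numpy as np

FT_PER_M = 3.28084  # standard foot-per-meter scale factor

heights_m = np.array([1.7, 1.8, 1.9, 2.0])
# Scale-only conversion: one vectorized multiply, no Python-level loop.
heights_ft = heights_m * FT_PER_M
```

This is why batch conversions scale with NumPy's element throughput rather than with per-element Python overhead.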

### Pydantic Integration

```python
Expand Down Expand Up @@ -131,6 +150,8 @@ AI agents can then convert units, check dimensions, and perform factor-label cal

## Features

- **NumPy arrays** — Vectorized operations with `NumberArray` for batch computations
- **Pandas/Polars** — Unit-aware DataFrames with `NumberSeries` and `NumberColumn`
- **Physical constants** — CODATA 2022 values with uncertainty propagation (`E = m * c**2`)
- **Custom constants** — Define domain-specific constants with uncertainty propagation
- **String parsing** — `parse("9.81 m/s^2")` with uncertainty support (`1.234 ± 0.005 m`)
@@ -157,7 +178,7 @@ AI agents can then convert units, check dimensions, and perform factor-label cal
| **0.7.x** | Compute Tool + Extension API | Complete |
| **0.8.x** | Basis Abstraction + String Parsing | Complete |
| **0.9.x** | Constants + Natural Units | Complete |
| **0.10.x** | NumPy/Polars Integration | Planned |
| **0.10.x** | NumPy/Pandas/Polars Integration | Current |
| **1.0.0** | API Stability | Planned |

See full roadmap: [ROADMAP.md](https://github.com/withtwoemms/ucon/blob/main/ROADMAP.md)
@@ -169,7 +190,7 @@ See full roadmap: [ROADMAP.md](https://github.com/withtwoemms/ucon/blob/main/ROA
| Section | Description |
|---------|-------------|
| [Getting Started](https://docs.ucon.dev/getting-started/) | Why ucon, quickstart, installation |
| [Guides](https://docs.ucon.dev/guides/) | MCP server, Pydantic, custom units, dimensional analysis |
| [Guides](https://docs.ucon.dev/guides/) | NumPy/Pandas/Polars, MCP server, Pydantic, custom units |
| [Reference](https://docs.ucon.dev/reference/) | API docs, unit tables, MCP tool schemas |
| [Architecture](https://docs.ucon.dev/architecture/) | Design principles, ConversionGraph, comparison with Pint |

56 changes: 40 additions & 16 deletions ROADMAP.md
@@ -48,14 +48,33 @@ ucon is a dimensional analysis library for engineers building systems where unit
| v0.9.2 | MCP Constants Tools | Complete |
| v0.9.3 | Natural Units + MCP Session Fixes | Complete |
| v0.9.4 | MCP Extraction | Complete |
| v0.10.0 | Scientific Computing | Planned |
| v0.10.0 | Scientific Computing | Complete |
| v1.0.0 | API Stability | Planned |

---

## Current Version: **v0.9.4** (complete)

Building on v0.9.3 baseline:
## Current Version: **v0.10.0** (complete)

Building on v0.9.4 baseline:
- `ucon.numpy` (`NumberArray` class for vectorized operations on dimensioned arrays)
- `ucon.pandas` (`NumberSeries` wrapper and `UconSeriesAccessor` for Pandas integration)
- `ucon.polars` (`NumberColumn` wrapper for Polars integration)
- NumPy array support: `units.meter([1, 2, 3])` → `NumberArray`
- Vectorized conversion: `heights.to(units.foot)` on arrays
- Vectorized arithmetic: add, sub, mul, div with unit tracking
- Per-element and uniform uncertainty propagation through arrays
- Reduction operations: `sum()`, `mean()`, `std()`, `min()`, `max()` with uncertainty
- Comparison operators returning boolean arrays for filtering
- N-D array support (not just 1D)
- Broadcasting support for compatible shapes
- Pandas accessor: `df['height'].ucon.with_unit(units.meter).to(units.foot)`
- Polars column wrapper: `NumberColumn(series, unit=units.meter)`
- Optional dependencies: `pip install ucon[numpy]`, `ucon[pandas]`, `ucon[polars]`
- Graceful degradation: clear `ImportError` messages when optional dependencies are not installed
- Performance caching: conversion paths, scale factors, unit multiplication
- Performance benchmarks: `make benchmark`, `make benchmark-pint`
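
The conversion-path caching above can be pictured as memoizing path lookup over the unit graph. A minimal sketch of the idea (hypothetical edge data and function names — not ucon's actual `ConversionGraph`):

```python
from functools import lru_cache

# Hypothetical edge list: unit -> [(neighbor, scale factor), ...]
EDGES = {
    "meter": [("foot", 3.28084), ("centimeter", 100.0)],
    "foot": [("inch", 12.0)],
}

@lru_cache(maxsize=None)
def scale_factor(src: str, dst: str) -> float:
    """BFS for a conversion path; repeated (src, dst) lookups hit the cache."""
    frontier = [(src, 1.0)]
    seen = {src}
    while frontier:
        unit, scale = frontier.pop(0)
        if unit == dst:
            return scale
        for neighbor, edge in EDGES.get(unit, []):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, scale * edge))
    raise ValueError(f"no path from {src} to {dst}")
```

The first `("meter", "foot")` lookup pays the graph search; every later call for the same pair is a dictionary hit, which is what makes repeated array conversions cheap.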

Previous versions include:
- `ucon.basis` (`Basis`, `BasisComponent`, `Vector`, `BasisTransform`, `BasisGraph`, `ConstantAwareBasisTransform`)
- `ucon.bases` (standard bases: `SI`, `CGS`, `CGS_ESU`, `NATURAL`; standard transforms including `SI_TO_NATURAL`)
- `ucon.dimension` (`Dimension` as frozen dataclass backed by basis-aware `Vector`)
@@ -675,25 +694,29 @@ Prerequisite for factor-label chains with countable items (tablets, doses).

---

## v0.10.0 — Scientific Computing
## v0.10.0 — Scientific Computing (Complete)

**Theme:** NumPy and DataFrame integration.

- [ ] `Number` wraps `np.ndarray` values
- [ ] Vectorized conversion and arithmetic
- [ ] Vectorized uncertainty propagation
- [ ] Polars integration: `NumberColumn` type
- [ ] Pandas integration: `NumberSeries` type
- [ ] Column-wise conversion
- [ ] Unit-aware arithmetic on columns
- [ ] Performance benchmarks
- [x] `NumberArray` class with vectorized arithmetic and conversion
- [x] Scalar and per-element uncertainty propagation
- [x] Reduction operations: `sum()`, `mean()`, `std()`, `min()`, `max()`
- [x] Comparison operators returning boolean arrays
- [x] Callable syntax: `meter([1, 2, 3])` → `NumberArray`
- [x] Map array support (`LinearMap`, `AffineMap`, `LogMap`, `ExpMap`)
- [x] Pandas integration: `NumberSeries` and `UconSeriesAccessor`
- [x] Polars integration: `NumberColumn`
- [x] Optional dependencies: `ucon[numpy]`, `ucon[pandas]`, `ucon[polars]`
- [x] Performance caching (conversion paths, scale factors, unit products)
- [x] Benchmarks: `make benchmark`, `make benchmark-pint`
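
The map items above extend to arrays because NumPy arithmetic is elementwise. A sketch of the affine case with a hypothetical stand-in class (not ucon's `AffineMap`):

```python
from dataclasses import dataclass

import numpy as np

@dataclass(frozen=True)
class Affine:
    """y = scale * x + offset; works unchanged on scalars and ndarrays."""
    scale: float
    offset: float

    def apply(self, x):
        return self.scale * x + self.offset

    def invert(self, y):
        return (y - self.offset) / self.scale

# Celsius -> Fahrenheit is affine; one definition covers both cases.
c_to_f = Affine(scale=9 / 5, offset=32.0)
temps_f = c_to_f.apply(np.array([0.0, 100.0]))  # elementwise
```

Because `apply` uses only `*`, `+`, `-`, and `/`, the scalar and array paths share a single implementation — the same property the `_log()`/`_exp()` helpers provide for the logarithmic maps.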

**Outcomes:**
- Seamless integration with NumPy-based scientific workflows
- Efficient batch conversions for large datasets
- First-class support for data science workflows
- Efficient batch conversions for large datasets (100M+ elements/sec)
- First-class support for data science workflows with Pandas and Polars
- Unit-safe transformations on tabular data
- Performance characteristics documented and optimized
- Performance competitive with pint, and much faster on object creation
- Lazy caching approximates a fixed registry's performance without its startup cost
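
The throughput figures above are measured along these lines — a self-contained sketch of the methodology (timing a scale-only vectorized conversion), not the actual `benchmarks/array_operations.py` script:

```python
import time

import numpy as np

def throughput(size: int, iterations: int = 50) -> float:
    """Elements/sec for a scale-only conversion at a given array size."""
    data = np.random.default_rng(0).random(size)
    start = time.perf_counter()
    for _ in range(iterations):
        _ = data * 3.28084  # meter -> foot, one vectorized multiply
    elapsed = time.perf_counter() - start
    return size * iterations / elapsed
```

Sweeping `size` over the `BENCH_SIZES` values (1000, 10000, 100000) separates per-call overhead, which dominates small arrays, from raw element throughput, which dominates large ones.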

---

@@ -718,6 +741,7 @@ Prerequisite for factor-label chains with countable items (tablets, doses).

| Feature | Notes |
|---------|-------|
| Cache Warming | `warm_cache()` API for precomputing common conversion paths |
| Decompose Tool | SLM enablement: deterministic `decompose` → `compute` pipeline |
| Uncertainty correlation | Full covariance tracking |
| Cython optimization | Performance parity with unyt |
6 changes: 6 additions & 0 deletions docs/guides/index.md
@@ -13,6 +13,12 @@ Task-oriented guides for common use cases.
- **[Pydantic Integration](pydantic-integration.md)** - Type-safe dimensional fields in Pydantic models
- **[Config Safety](dimensional-safety-config.md)** - Dimensional safety for configuration files

## Scientific Computing

- **[NumPy Arrays](numpy-arrays.md)** - Vectorized operations with `NumberArray`
- **[Pandas Integration](pandas-integration.md)** - Unit-aware DataFrames with `NumberSeries`
- **[Polars Integration](polars-integration.md)** - Unit-aware Polars with `NumberColumn`

## Calculation Guides

- **[Dimensional Analysis](dimensional-analysis.md)** - Step-by-step factor-label calculations