Releases: SwiftWing21/agentome

v0.4.0b1 — sync with helix-context 0.4.0b1 + packet-mode re-exports

18 Apr 21:54


Sync with helix-context 0.4.0b1

Agentome is a thin identity wrapper — this release tracks
helix-context 0.4.0b1 and exposes the new packet-mode surface at
Agentome's top level.

What's new for Agentome consumers

Top-level packet-mode re-exports

from agentome import ContextPacket, ContextItem, RefreshTarget

Shipped in helix-context 0.4.0b1, now directly importable from
agentome. No reaching for `helix_context` required.
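Consumers that must run against both pre- and post-0.4.0b1 installs can fall back to the upstream import path when a name isn't mirrored yet. A minimal sketch of that pattern (`resolve` is a hypothetical helper, not part of agentome's API):

```python
import importlib


def resolve(name: str, primary: str = "agentome", fallback: str = "helix_context"):
    """Fetch `name` from the wrapper package, falling back to upstream.

    Hypothetical compatibility helper for consumers pinned across the
    0.4.0b1 boundary; not part of agentome itself.
    """
    for pkg in (primary, fallback):
        try:
            return getattr(importlib.import_module(pkg), name)
        except (ImportError, AttributeError):
            continue
    raise ImportError(f"{name} not found in {primary} or {fallback}")
```

With 0.4.0b1 installed, `resolve("ContextPacket")` and `from agentome import ContextPacket` return the same object, since agentome only re-exports.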

New submodule: `agentome.context_packet`

from agentome.context_packet import build_context_packet, get_refresh_targets

Mirrors the helix-context path form.

Pathway-identity reframe

Description updated: "Coordinate index layer for agent context — Agentome weighs, doesn't retrieve. Composes on top of the bundled SQLite genome today, stacks on any content store tomorrow."

Mirrored extras

Every helix-context extra is now mirrored on agentome so
`pip install agentome[launcher]` == `pip install
helix-context[launcher]`. Previously undeclared extras were
silently skipped; now they resolve correctly.

| Extra | Enables |
| --- | --- |
| `accel` | orjson |
| `embeddings` | numpy, sentence-transformers, torch |
| `cpu` | spaCy NER |
| `mcp` | MCP SDK for `python -m helix_context.mcp_server` |
| `nli` | standalone torch + transformers |
| `otel` | opentelemetry stack |
| `launcher` / `launcher-native` / `launcher-tray` | supervisor |
| `ast` | tree-sitter + 4 language grammars |
| `scorerift` | ScoreRift bridge |
| `codec` | Headroom semantic compression (recommended) |
| `all` | full feature surface |

README v2

  • Full dependency matrix (core + extras table with per-extra
    purpose + non-pip runtime deps)
  • Recommended install profiles (minimal / MCP host / daily dev /
    observability / CI)
  • Headroom composition section explaining `[codec]` as the
    recommended compression path

Install

```bash
pip install agentome                  # core
pip install "agentome[codec,mcp]"     # agent host with Headroom compression
pip install "agentome[all,launcher]"  # daily driver
```

Migration

  • `from agentome import AgentomeManager, AgentomeConfig` — legacy
    aliases still work. Prefer `HelixContextManager` + `HelixConfig`
    in new code.
  • Existing consumers: nothing breaks. Packet-mode re-exports are
    purely additive.

Powered by Agentome.

v0.3.0b3 — Sync with Helix Context v0.3.0b3

10 Apr 08:40


Sync release

Bumps helix-context to v0.3.0b3 to pick up ribosome pause/resume endpoints and the learn() timeout wrapper.

Inherited from helix-context v0.3.0b3

  • /admin/ribosome/pause + /resume + /status — unload the ribosome's LLM model without restarting Helix, for VRAM contention scenarios
  • learn() timeout wrapper — 15s hard cap on background replicate calls, prevents server crashes when Ollama queues back up
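The timeout wrapper follows a standard pattern: run the blocking call on a worker thread and stop waiting after the cap. A sketch of that pattern only, not the upstream `learn()` implementation (helix-context's actual cap and fallback behavior live upstream):

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout

_pool = ThreadPoolExecutor(max_workers=1)


def call_with_timeout(fn, *args, timeout: float = 15.0, default=None):
    """Run fn(*args) on a worker thread; give up after `timeout` seconds.

    The abandoned call keeps running in the background, so this protects
    the caller (the server loop) rather than cancelling the work itself.
    """
    future = _pool.submit(fn, *args)
    try:
        return future.result(timeout=timeout)
    except FutureTimeout:
        return default
```

Note the trade-off: a hard cap keeps the server responsive when Ollama's queue backs up, but the timed-out replicate call is abandoned, not killed.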

Install

```bash
pip install agentome==0.3.0b3
```

Use case

```bash
# Free the ribosome model from Ollama for a benchmark
curl -X POST localhost:11437/admin/ribosome/pause
curl -X POST localhost:11434/api/generate -d '{"model": "gemma4:e4b", "keep_alive": 0, "prompt": ""}'

# Run your benchmark
python benchmarks/bench_needle_1000.py

# Restore normal operation
curl -X POST localhost:11437/admin/ribosome/resume
```

🤖 Generated with Claude Code

v0.3.0b2 — Sync with Helix Context v0.3.0b2

10 Apr 07:56


Sync release

Bumps helix-context dependency to v0.3.0b2 to pick up agent-friendly /context response fields and hot-reload admin endpoints.

What's new (inherited from helix-context v0.3.0b2)

Agent metadata in /context responses

  • recommendation (trust / verify / refresh / reread_raw)
  • citations (gene_id + source + score per expressed gene)
  • latency_ms, compression_ratio, moe_mode
  • Optional verbose: true adds promoter tags
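Field names above are from this release; the values in the sketch below are made up, and the exact response shape is defined upstream in helix-context. One way an agent might act on the `recommendation` field:

```python
# Illustrative /context response (field names from this release; values invented).
response = {
    "recommendation": "verify",
    "citations": [{"gene_id": "g-0042", "source": "notes/arch.md", "score": 0.81}],
    "latency_ms": 1043,
    "compression_ratio": 3.2,
    "moe_mode": "slate",
}


def needs_fresh_read(resp: dict) -> bool:
    """Refetch raw content when the server recommends it."""
    return resp.get("recommendation") in ("refresh", "reread_raw")
```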

Hot-reload admin endpoints

  • POST /admin/reload — full runtime refresh (config + genome + ΣĒMA cache)
  • POST /admin/sema/rebuild — force rebuild vector cache
  • POST /admin/checkpoint?mode=TRUNCATE — flush WAL
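These are plain POST endpoints, so a client needs nothing beyond the stdlib. A minimal request-builder sketch (`admin_post` is hypothetical, and the base URL assumes a local deployment on the admin port):

```python
from urllib import request

BASE = "http://localhost:11437"  # assumed local Helix admin port


def admin_post(path: str, **params) -> request.Request:
    """Build a POST request for a Helix admin endpoint."""
    query = "&".join(f"{k}={v}" for k, v in params.items())
    return request.Request(
        BASE + path + (f"?{query}" if query else ""),
        method="POST",
    )


# request.urlopen(admin_post("/admin/reload")) would fire the call.
checkpoint = admin_post("/admin/checkpoint", mode="TRUNCATE")
```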

Performance

Post-restart needle benchmark: 10/10 retrieval, 1.0s avg latency (down from 21-120s). The ΣĒMA Mode B vector cache fix from v0.2.0b2 is finally active.

Install

```bash
pip install agentome==0.3.0b2
```

🤖 Generated with Claude Code

v0.3.0b1 — Sync with Helix Context v0.3.0b1

10 Apr 07:33


Sync release

Bumps helix-context to v0.3.0b1 to pick up tree-sitter AST chunking, BABILong multi-hop benchmark, and the training runbook for DeBERTa re-train.

No API changes in agentome itself — all new features inherited automatically through the thin wrapper.

Installation

```bash
pip install agentome==0.3.0b1
```

For tree-sitter AST chunking:
```bash
pip install "helix-context[ast]"
```

Inherited from helix-context v0.3.0b1

  • Tree-sitter AST chunking (opt-in) — Python, Rust, JavaScript, TypeScript
  • BABILong multi-hop benchmark — two- and three-hop reasoning tests
  • Training runbook — DeBERTa re-train workflow documented
  • Authority boosts (from v0.2.0b2) — distinguishes "about X" from "mentions X"
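The idea behind AST chunking is to split source at syntactic boundaries instead of fixed byte windows. A toy sketch of that idea using the stdlib `ast` module for Python only (upstream uses tree-sitter and covers four grammars; this is not the upstream chunker):

```python
import ast


def chunk_python(source: str) -> list[str]:
    """Split Python source into one chunk per top-level statement."""
    tree = ast.parse(source)
    return [ast.get_source_segment(source, node) for node in tree.body]
```

Each chunk stays a syntactically complete unit, so a retrieved gene never starts mid-function.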


🤖 Generated with Claude Code

v0.3.0b5 — sync with helix-context v0.3.0b5 (Headroom adoption)

10 Apr 21:23


Syncs the `agentome` wrapper to helix-context v0.3.0b5.

`pip install agentome` now pulls in:

  • Headroom integration — CPU-resident semantic compression via Kompress, LogCompressor, DiffCompressor, CodeAwareCompressor (by Tejas Chopra, Apache-2.0). Ships as the optional `[codec]` extra on helix-context; until agentome mirrors that extra, install it upstream directly, after which `pip install agentome[codec]` will work here too.
  • Cross-session restart announcement protocol — two agents sharing a Helix server can now coordinate restarts without misreading outages as crashes
  • Dynamic budget tiers — TIGHT/FOCUSED/BROAD expression window sizing based on retrieval score confidence
  • Benchmark state monitor — VRAM / hang / contamination detection during long-running benchmarks
  • `HELIX_DISABLE_HEADROOM` env toggle — A/B benchmark Headroom vs legacy truncation on the same genome without reverting code
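The toggle is an environment variable, so A/B runs flip it per process without touching code. How upstream parses the value is not documented in this release, so the accepted values in this sketch are an assumption:

```python
import os


def headroom_enabled() -> bool:
    """True unless HELIX_DISABLE_HEADROOM is set to a truthy value.

    Assumed parsing: upstream may accept a different value set.
    """
    flag = os.environ.get("HELIX_DISABLE_HEADROOM", "").strip().lower()
    return flag not in ("1", "true", "yes")
```

For a benchmark pair, run once with the variable unset (Headroom) and once with `HELIX_DISABLE_HEADROOM=1` (legacy truncation) against the same genome.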

Research retrospective

For the full adoption story + forensic analysis of our N=20 benchmark — including the cross-session sync where laude picked up raude's Issue #3 and ran a v2 harness implementing the fix within 15 minutes — see Discussion #2 on helix-context.

Headline finding from that discussion: the "extraction wall" we thought we saw was a benchmark harness artifact. When v2 retrieval actually surfaces the right gene, qwen3:8b extracts the answer correctly on every single attempt (4/4 = 100% answer-given-retrieval rate). All future retrieval-quality work targets ingest-time noise dilution (Struggle 1), not extraction.

Commits in this sync

  • `c56d127` release: v0.3.0b5 — sync with helix-context v0.3.0b5

(agentome remains a thin re-export wrapper around helix-context; all runtime behavior changes live upstream.)

— raude, Claude Code Opus 4.6 (1M context)

v0.2.0b1 — Sync with Helix Context v0.2.0b2

10 Apr 07:17


Sync release

Bumps helix-context dependency to v0.2.0b2 to pick up the SIKE validation, MoE decoder, retrieval authority boosts, and score-gated expression shipped in the upstream package.

No API changes in agentome itself — all new features inherited automatically through the thin wrapper.

Installation

```bash
pip install agentome==0.2.0b1
```

is equivalent to:

```bash
pip install helix-context==0.2.0b2
```

What you get (inherited from helix-context v0.2.0b2)

  • SIKE scale-invariant retrieval — 10/10 across 0.6B to 8B models
  • MoE-aware decoder — answer slate for small/SWA models
  • Retrieval authority boosts — distinguishes "about X" from "mentions X"
  • Score-gated expression — drops weak-scoring tail candidates
  • ΣĒMA cold-storage tiers — OPEN / EUCHROMATIN / HETEROCHROMATIN
  • WAL durability — periodic checkpoint strategy
  • 179 tests passing upstream
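Score-gated expression is a tail cut on retrieval candidates: anything below a score floor never reaches the expression window. A minimal sketch (the floor value and flat gating logic here are illustrative, not the upstream implementation):

```python
def gate_candidates(candidates: list[dict], floor: float = 0.35) -> list[dict]:
    """Keep only candidates whose retrieval score clears the floor.

    `floor` is an illustrative threshold; upstream gating may be
    adaptive rather than a fixed cutoff.
    """
    return [c for c in candidates if c["score"] >= floor]
```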


🤖 Generated with Claude Code

v0.1.0b1

08 Apr 23:11


Pre-release

Initial beta release — genome-based context compression for local LLMs