# Sync release

Bumps `helix-context` to v0.3.0b3 to pick up the ribosome pause/resume endpoints and the `learn()` timeout wrapper.
## Inherited from helix-context v0.3.0b3

- `/admin/ribosome/pause`, `/resume`, `/status` — unload the ribosome's LLM model without restarting Helix, for VRAM contention scenarios
- `learn()` timeout wrapper — 15s hard cap on background replicate calls; prevents server crashes when Ollama queues back up
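A minimal sketch of the kind of hard-timeout wrapper described above: run the background call on a worker thread and give up after the cap instead of letting a stalled Ollama queue hang the server. The names `learn_with_timeout`, `replicate_call`, and `LEARN_TIMEOUT_S` are illustrative, not the actual helix-context API.

```python
# Sketch only: a 15s hard cap on a background call, mirroring the
# learn() timeout wrapper described in the release notes.
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout

LEARN_TIMEOUT_S = 15  # hard cap from the release notes

_executor = ThreadPoolExecutor(max_workers=1)

def learn_with_timeout(replicate_call, *args, **kwargs):
    """Run replicate_call, but abandon it after LEARN_TIMEOUT_S seconds."""
    future = _executor.submit(replicate_call, *args, **kwargs)
    try:
        return future.result(timeout=LEARN_TIMEOUT_S)
    except FutureTimeout:
        # Ollama queue backed up: drop this result rather than crash.
        return None
```

The key design point is that the timeout is a hard cap on *waiting*, not on the call itself: the worker thread may still finish later, but the server stops blocking on it.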
## Install
```bash
pip install agentome==0.3.0b3
```
## Use case
```bash
# Free the ribosome model from Ollama for a benchmark
curl -X POST localhost:11437/admin/ribosome/pause
curl -X POST localhost:11434/api/generate -d '{"model": "gemma4:e4b", "keep_alive": 0, "prompt": ""}'

# Run your benchmark
python benchmarks/bench_needle_1000.py

# Restore normal operation
curl -X POST localhost:11437/admin/ribosome/resume
```
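For scripted benchmarks, the `/admin/ribosome/status` endpoint can gate the run so the benchmark only starts once the model is actually unloaded. This sketch assumes the status endpoint returns JSON with a boolean `paused` field; the release notes don't document the payload shape, so treat that field name as a guess.

```python
# Sketch: poll /admin/ribosome/status until the ribosome reports paused.
# The "paused" field name is an assumption, not a documented contract.
import json
import time
import urllib.request

STATUS_URL = "http://localhost:11437/admin/ribosome/status"

def is_paused(payload: dict) -> bool:
    # Assumed payload shape: {"paused": true} once the model is unloaded.
    return bool(payload.get("paused"))

def wait_until_paused(timeout_s: float = 30.0, poll_s: float = 1.0) -> bool:
    """Return True once status reports paused, False if timeout_s elapses."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        with urllib.request.urlopen(STATUS_URL) as resp:
            if is_paused(json.load(resp)):
                return True
        time.sleep(poll_s)
    return False
```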
🤖 Generated with Claude Code