A research lab: Systems · Intelligence · Precision. UltraCompress is our flagship publicly shipped product.
Extreme compression infrastructure for large language models. In our six-model head-to-head benchmark, it was the only sub-3-bits-per-weight method with zero catastrophic failures.
`pip install ultracompress`
Apache-2.0 CLI. Pre-compressed reference models distributed via the Hugging Face Hub (rolling release through April–May 2026).
Patent pending: USPTO 64/049,511 and 64/049,517, filed April 25, 2026.
- v0.1 alpha: pre-compressed reference models for the Qwen3, Llama, and Mistral families, releasing throughout April–May 2026
- v0.2 (Q3 2026): `uc compress` (self-compression of customer models), Track B architectural-compression variants, and native exports to llama.cpp, vLLM, TensorRT-LLM, and CoreML
- Pilots / commercial → founder@sipsalabs.com
- Patents / licensing → legal@sipsalabs.com
- Press / media → press@sipsalabs.com
- Security disclosure → security@sipsalabs.com
- General → hello@sipsalabs.com