An OpenAI-compatible reverse proxy you run yourself. It gives you the features of an AI gateway (guardrails, budgets, rate limits, multi-provider routing) while keeping control on your side: your client talks only to the proxy you host.
Updated Mar 6, 2026 - Go
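Because the proxy speaks the OpenAI wire format, a client only needs to change its base URL; gateway concerns like virtual keys, budgets, and rate limits sit in front of the forwarding step. A minimal sketch of that shape in Go, using the standard library's reverse proxy; the upstream, gateway handler, and virtual key below are invented for illustration and are not this project's code:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
	"net/http/httputil"
	"net/url"
)

// callThroughGateway spins up a fake upstream and a gateway that checks a
// virtual key before proxying, then sends one request through the chain.
// Servers, paths, and keys here are stand-ins, not a real provider.
func callThroughGateway() (int, string) {
	// Fake upstream standing in for an OpenAI-compatible provider API.
	upstream := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintf(w, "upstream saw %s", r.URL.Path)
	}))
	defer upstream.Close()

	target, _ := url.Parse(upstream.URL)
	proxy := httputil.NewSingleHostReverseProxy(target)

	// The gateway: reject requests without a key, otherwise forward upstream.
	// Budgets, rate limits, and guardrails would hook in at this same point.
	gateway := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.Header.Get("Authorization") == "" {
			http.Error(w, "missing virtual key", http.StatusUnauthorized)
			return
		}
		proxy.ServeHTTP(w, r)
	}))
	defer gateway.Close()

	// The client only swaps its base URL; the request shape stays OpenAI-style.
	req, _ := http.NewRequest("POST", gateway.URL+"/v1/chat/completions", nil)
	req.Header.Set("Authorization", "Bearer local-virtual-key")
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return 0, err.Error()
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	return resp.StatusCode, string(body)
}

func main() {
	code, body := callThroughGateway()
	fmt.Println(code, body)
}
```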
Provider-agnostic multi-agent coding orchestrator in Go. Unified router for any LLM provider (Ollama local+cloud, OpenAI, Anthropic, Google Gemini) with role-based model assignment, fallback chains, and cost tracking. Inspired by gastown and LiteLLM.
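The two routing ideas that description names (role-based model assignment and fallback chains) can be sketched roughly as an ordered list of providers per role, tried until one succeeds. Every type and name below is hypothetical, not the repo's actual API:

```go
package main

import (
	"errors"
	"fmt"
)

// Provider is a hypothetical stand-in for an LLM backend (Ollama, OpenAI,
// Anthropic, Gemini); Call would wrap the real API client.
type Provider struct {
	Name string
	Call func(prompt string) (string, error)
}

// Router maps an agent role to an ordered fallback chain of providers.
type Router struct {
	chains map[string][]Provider
}

// Route tries each provider assigned to the role in order and returns the
// first success; if every provider errors, the whole chain fails.
func (r *Router) Route(role, prompt string) (provider, reply string, err error) {
	chain, ok := r.chains[role]
	if !ok {
		return "", "", errors.New("no chain for role " + role)
	}
	for _, p := range chain {
		if out, callErr := p.Call(prompt); callErr == nil {
			return p.Name, out, nil
		}
	}
	return "", "", errors.New("all providers failed for role " + role)
}

// demo builds a router where the preferred local model is down, forcing a
// fallback to a hosted one.
func demo() (string, string, error) {
	local := Provider{"ollama-local", func(string) (string, error) {
		return "", errors.New("connection refused") // simulate an outage
	}}
	hosted := Provider{"openai", func(p string) (string, error) {
		return "ok:" + p, nil
	}}
	r := &Router{chains: map[string][]Provider{
		"coder":    {local, hosted}, // prefer local, fall back to hosted
		"reviewer": {hosted},
	}}
	return r.Route("coder", "write tests")
}

func main() {
	name, out, err := demo()
	fmt.Println(name, out, err)
}
```

Cost tracking would slot naturally into `Route`, recording the winning provider and token usage per call.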
Kai-zen-Tunnel: local AI bridge with virtual keys for web and API providers