6 changes: 3 additions & 3 deletions README.md
@@ -184,15 +184,15 @@ coding = "ollama/qwen3"

```toml
[llm.provider.my-provider]
api_type = "openai_completions" # or "anthropic"
api_type = "openai_completions" # or "openai_chat_completions", "openai_responses", "anthropic"
base_url = "https://my-llm-host.example.com"
api_key = "env:MY_PROVIDER_KEY"

[defaults.routing]
channel = "my-provider/my-model"
```

Additional built-in providers include **NVIDIA**, **MiniMax**, **Moonshot AI (Kimi)**, and **Z.AI Coding Plan** — configure with `nvidia_key`, `minimax_key`, `moonshot_key`, or `zai_coding_plan_key` in `[llm]`.
Additional built-in providers include **Kilo Gateway**, **NVIDIA**, **MiniMax**, **Moonshot AI (Kimi)**, and **Z.AI Coding Plan** — configure with `kilo_key`, `nvidia_key`, `minimax_key`, `moonshot_key`, or `zai_coding_plan_key` in `[llm]`.
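A minimal sketch of wiring up one of these built-in keys, using the Kilo Gateway entries this change adds (the model string mirrors the one used elsewhere in this PR; adjust to your plan):

```toml
[llm]
# Read the Kilo Gateway key from the environment at startup
kilo_key = "env:KILO_API_KEY"

[defaults.routing]
# Kilo model strings are prefixed kilo/<provider>/<model>
channel = "kilo/anthropic/claude-sonnet-4.5"
```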

### Skills

@@ -381,7 +381,7 @@ Read the full vision in the [roadmap](docs/content/docs/(deployment)/roadmap.mdx
### Prerequisites

- **Rust** 1.85+ ([rustup](https://rustup.rs/))
- An LLM API key from any supported provider (Anthropic, OpenAI, OpenRouter, Z.ai, Groq, Together, Fireworks, DeepSeek, xAI, Mistral, NVIDIA, MiniMax, Moonshot AI, OpenCode Zen) — or use `spacebot auth login` for Anthropic OAuth
- An LLM API key from any supported provider (Anthropic, OpenAI, OpenRouter, Kilo Gateway, Z.ai, Groq, Together, Fireworks, DeepSeek, xAI, Mistral, NVIDIA, MiniMax, Moonshot AI, OpenCode Zen) — or use `spacebot auth login` for Anthropic OAuth

### Build and Run

13 changes: 9 additions & 4 deletions docs/content/docs/(configuration)/config.mdx
@@ -24,6 +24,7 @@ spacebot --config /path/to.toml # CLI override
anthropic_key = "env:ANTHROPIC_API_KEY"
openai_key = "env:OPENAI_API_KEY"
openrouter_key = "env:OPENROUTER_API_KEY"
kilo_key = "env:KILO_API_KEY"
zhipu_key = "env:ZHIPU_API_KEY"
groq_key = "env:GROQ_API_KEY"
together_key = "env:TOGETHER_API_KEY"
@@ -173,7 +174,7 @@ anthropic_key = "env:ANTHROPIC_API_KEY"

This reads `ANTHROPIC_API_KEY` from the environment at startup. If the variable is unset, the value is treated as missing.

LLM keys also have implicit env fallbacks — if no key is set in the TOML, Spacebot checks `ANTHROPIC_API_KEY`, `OPENAI_API_KEY`, and `OPENROUTER_API_KEY` automatically.
LLM keys also have implicit env fallbacks — if no key is set in the TOML, Spacebot checks `ANTHROPIC_API_KEY`, `OPENAI_API_KEY`, `OPENROUTER_API_KEY`, and `KILO_API_KEY` automatically.

## Env-Only Mode

@@ -208,6 +209,7 @@ Model names include the provider as a prefix:
| Anthropic | `anthropic/<model>` | `anthropic/claude-sonnet-4-20250514` |
| OpenAI | `openai/<model>` | `openai/gpt-4o` |
| OpenRouter | `openrouter/<provider>/<model>` | `openrouter/anthropic/claude-sonnet-4-20250514` |
| Kilo Gateway | `kilo/<provider>/<model>` | `kilo/anthropic/claude-sonnet-4.5` |
| Custom provider | `<provider_id>/<model>` | `my_openai/gpt-4o-mini` |

You can mix providers across process types. See [Routing](/docs/routing) for the full routing system.
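As a sketch of the prefix scheme from the table above (only `channel` is shown; other routing fields use the same model-string format):

```toml
[defaults.routing]
channel = "anthropic/claude-sonnet-4-20250514"                # direct provider
# channel = "openrouter/anthropic/claude-sonnet-4-20250514"   # via OpenRouter
# channel = "kilo/anthropic/claude-sonnet-4.5"                # via Kilo Gateway
# channel = "my_openai/gpt-4o-mini"                           # custom provider id
```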
@@ -329,6 +331,7 @@ If you define a custom provider with the same ID as a legacy key, your custom co
| `anthropic_key` | string | None | Anthropic API key (or `env:VAR_NAME`) |
| `openai_key` | string | None | OpenAI API key (or `env:VAR_NAME`) |
| `openrouter_key` | string | None | OpenRouter API key (or `env:VAR_NAME`) |
| `kilo_key` | string | None | Kilo Gateway API key (or `env:VAR_NAME`) |
| `zhipu_key` | string | None | Zhipu AI (GLM) API key (or `env:VAR_NAME`) |
| `groq_key` | string | None | Groq API key (or `env:VAR_NAME`) |
| `together_key` | string | None | Together AI API key (or `env:VAR_NAME`) |
@@ -344,24 +347,26 @@ Custom providers allow configuring LLM providers with custom endpoints and API t

```toml
[llm.provider.<id>]
api_type = "anthropic" # Required - one of: anthropic, openai_completions, openai_responses
api_type = "anthropic" # Required - see supported values below
base_url = "https://api..." # Required - valid URL
api_key = "env:API_KEY" # Required - API key (supports env:VAR_NAME format)
name = "My Provider" # Optional - friendly name for display
```

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| `api_type` | string | Yes | API protocol type. One of: `anthropic` (Anthropic Messages API), `openai_completions` (OpenAI Chat Completions-compatible API), or `openai_responses` (OpenAI Responses API-compatible) |
| `api_type` | string | Yes | API protocol type. One of: `anthropic`, `openai_completions`, `openai_chat_completions`, `openai_responses`, `gemini`, or `kilo_gateway` |
| `base_url` | string | Yes | Base URL of the API endpoint. Must be a valid URL (including protocol) |
| `api_key` | string | Yes | API key for authentication. Supports `env:VAR_NAME` syntax to reference environment variables |
| `name` | string | No | Optional friendly name for the provider (displayed in logs and UI) |

> Note:
> - For `openai_completions` and `openai_responses`, configure `base_url` as the provider root URL (usually without a trailing `/v1`).
> - For `openai_completions`, `openai_chat_completions`, and `openai_responses`, configure `base_url` as the provider root URL (usually without a trailing `/v1`).
> - Spacebot appends the endpoint path automatically:
> - `openai_completions` -> `/v1/chat/completions`
> - `openai_chat_completions` -> `/chat/completions`
> - `openai_responses` -> `/v1/responses`
> - `kilo_gateway` -> `/chat/completions` plus Kilo-required `HTTP-Referer` / `X-Title` headers
> - If you include `/v1` in `base_url`, requests can end up with duplicated paths such as `/v1/v1/...`.
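To illustrate the base-URL rule above, a custom OpenAI-compatible provider might be configured as follows (host and key variable are placeholders):

```toml
[llm.provider.my_openai]
api_type = "openai_completions"
# Root URL only -- Spacebot appends /v1/chat/completions itself.
# Writing base_url = "https://my-llm-host.example.com/v1" would
# produce requests to /v1/v1/chat/completions.
base_url = "https://my-llm-host.example.com"
api_key = "env:MY_PROVIDER_KEY"
```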

**Provider ID Requirements:**
2 changes: 1 addition & 1 deletion docs/content/docs/(deployment)/roadmap.mdx
@@ -18,7 +18,7 @@ The full message-in → LLM → response-out pipeline is wired end-to-end across
- **Config** — hierarchical TOML with `Config`, `AgentConfig`, `ResolvedAgentConfig`, `Binding`, `MessagingConfig`. File watcher with event filtering and content hash debounce for hot-reload.
- **Multi-agent** — per-agent database isolation, `Agent` struct bundles all dependencies
- **Database connections** — SQLite + LanceDB + redb per-agent, migrations for all tables
- **LLM** — `SpacebotModel` implements Rig's `CompletionModel`, routes through `LlmManager` via HTTP with retries and fallback chains across 11 providers (Anthropic, OpenAI, OpenRouter, Z.ai, Groq, Together, Fireworks, DeepSeek, xAI, Mistral, OpenCode Zen)
- **LLM** — `SpacebotModel` implements Rig's `CompletionModel`, routes through `LlmManager` via HTTP with retries and fallback chains across 12 providers (Anthropic, OpenAI, OpenRouter, Kilo Gateway, Z.ai, Groq, Together, Fireworks, DeepSeek, xAI, Mistral, OpenCode Zen)
- **Model routing** — `RoutingConfig` with process-type defaults, task overrides, fallback chains
- **Memory** — full stack: types, SQLite store (CRUD + graph), LanceDB (embeddings + vector + FTS), fastembed, hybrid search (RRF fusion). `memory_type` filter wired end-to-end through SearchConfig. `total_cmp` for safe sorting.
- **Memory maintenance** — decay + prune implemented
3 changes: 2 additions & 1 deletion docs/content/docs/(getting-started)/quickstart.mdx
@@ -36,7 +36,7 @@ See [Docker deployment](/docs/docker) for image variants, compose files, and con

- **Rust 1.85+** — `rustup update stable`
- **Bun** (optional, for the web UI) — `curl -fsSL https://bun.sh/install | bash`
- **An LLM API key** — Anthropic, OpenAI, or OpenRouter
- **An LLM API key** — Anthropic, OpenAI, OpenRouter, or Kilo Gateway

### Install

@@ -77,6 +77,7 @@ Create `~/.spacebot/config.toml`:
[llm]
anthropic_key = "sk-ant-..."
# or: openrouter_key = "sk-or-..."
# or: kilo_key = "sk-..."
# or: openai_key = "sk-..."
# Keys also support env references: anthropic_key = "env:ANTHROPIC_API_KEY"

1 change: 1 addition & 0 deletions interface/src/api/client.ts
@@ -712,6 +712,7 @@ export interface ProviderStatus {
openai: boolean;
openai_chatgpt: boolean;
openrouter: boolean;
kilo: boolean;
zhipu: boolean;
groq: boolean;
together: boolean;
2 changes: 2 additions & 0 deletions interface/src/components/ModelSelect.tsx
@@ -15,6 +15,7 @@ interface ModelSelectProps {
const PROVIDER_LABELS: Record<string, string> = {
anthropic: "Anthropic",
openrouter: "OpenRouter",
kilo: "Kilo Gateway",
openai: "OpenAI",
"openai-chatgpt": "ChatGPT Plus (OAuth)",
deepseek: "DeepSeek",
@@ -128,6 +129,7 @@ export function ModelSelect({

const providerOrder = [
"openrouter",
"kilo",
"anthropic",
"openai",
"openai-chatgpt",
21 changes: 21 additions & 0 deletions interface/src/lib/providerIcons.tsx
@@ -90,6 +90,26 @@ function OllamaIcon({ size = 24, className }: IconProps) {
);
}

function KiloIcon({ size = 24, className }: IconProps) {
return (
<svg
width={size}
height={size}
viewBox="0 0 100 100"
fill="none"
xmlns="http://www.w3.org/2000/svg"
className={className}
aria-hidden="true"
focusable="false"
>
<path
fill="currentColor"
d="M0,0v100h100V0H0ZM92.5925926,92.5925926H7.4074074V7.4074074h85.1851852v85.1851852ZM61.1111044,71.9096084h9.2592593v7.4074074h-11.6402116l-5.026455-5.026455v-11.6402116h7.4074074v9.2592593ZM77.7777711,71.9096084h-7.4074074v-9.2592593h-9.2592593v-7.4074074h11.6402116l5.026455,5.026455v11.6402116ZM46.2962963,61.1114207h-7.4074074v-7.4074074h7.4074074v7.4074074ZM22.2222222,53.7040133h7.4074074v16.6666667h16.6666667v7.4074074h-19.047619l-5.026455-5.026455v-19.047619ZM77.7777711,38.8888889v7.4074074h-24.0740741v-7.4074074h8.2781918v-9.2592593h-8.2781918v-7.4074074h10.6591442l5.026455,5.026455v11.6402116h8.3884749ZM29.6296296,30.5555556h9.2592593l7.4074074,7.4074074v8.3333333h-7.4074074v-8.3333333h-9.2592593v8.3333333h-7.4074074v-24.0740741h7.4074074v8.3333333ZM46.2962963,30.5555556h-7.4074074v-8.3333333h7.4074074v8.3333333Z"
/>
</svg>
);
}

export function ProviderIcon({ provider, className = "text-ink-faint", size = 24 }: ProviderIconProps) {
const iconProps: Partial<IconProps> = {
size,
@@ -101,6 +121,7 @@ export function ProviderIcon({ provider, className = "text-ink-faint", size = 24
openai: OpenAI,
"openai-chatgpt": OpenAI,
openrouter: OpenRouter,
kilo: KiloIcon,
groq: Groq,
mistral: Mistral,
gemini: Google,
16 changes: 12 additions & 4 deletions interface/src/routes/Settings.tsx
@@ -72,6 +72,14 @@ const PROVIDERS = [
envVar: "OPENROUTER_API_KEY",
defaultModel: "openrouter/anthropic/claude-sonnet-4",
},
{
id: "kilo",
name: "Kilo Gateway",
description: "OpenAI-compatible multi-provider gateway",
placeholder: "sk-...",
envVar: "KILO_API_KEY",
defaultModel: "kilo/anthropic/claude-sonnet-4.5",
},
{
id: "opencode-zen",
name: "OpenCode Zen",
@@ -523,9 +531,9 @@ export function Settings() {
};

return (
<div className="flex h-full">
<div className="flex h-full min-h-0 overflow-hidden">
{/* Sidebar */}
<div className="flex w-52 flex-shrink-0 flex-col border-r border-app-line/50 bg-app-darkBox/20 overflow-y-auto">
<div className="flex min-h-0 w-52 flex-shrink-0 flex-col overflow-y-auto border-r border-app-line/50 bg-app-darkBox/20">
<div className="px-3 pb-1 pt-4">
<span className="text-tiny font-medium uppercase tracking-wider text-ink-faint">
Settings
@@ -545,13 +553,13 @@
</div>

{/* Content */}
<div className="flex flex-1 flex-col overflow-hidden">
<div className="flex min-h-0 flex-1 flex-col overflow-hidden">
<header className="flex h-12 items-center border-b border-app-line bg-app-darkBox/50 px-6">
<h1 className="font-plex text-sm font-medium text-ink">
{SECTIONS.find((s) => s.id === activeSection)?.label}
</h1>
</header>
<div className="flex-1 overflow-y-auto">
<div className="min-h-0 flex-1 overflow-y-auto overscroll-contain">
{activeSection === "providers" ? (
<div className="mx-auto max-w-2xl px-6 py-6">
{/* Section header */}
4 changes: 4 additions & 0 deletions src/api/models.rs
@@ -105,6 +105,7 @@ fn direct_provider_mapping(models_dev_id: &str) -> Option<&'static str> {
match models_dev_id {
"anthropic" => Some("anthropic"),
"openai" => Some("openai"),
"kilo" => Some("kilo"),
"deepseek" => Some("deepseek"),
"xai" => Some("xai"),
"mistral" => Some("mistral"),
@@ -406,6 +407,9 @@ pub(super) async fn configured_providers(config_path: &std::path::Path) -> Vec<&
if has_key("openrouter_key", "OPENROUTER_API_KEY") {
providers.push("openrouter");
}
if has_key("kilo_key", "KILO_API_KEY") {
providers.push("kilo");
}
if has_key("zhipu_key", "ZHIPU_API_KEY") {
providers.push("zhipu");
}
22 changes: 18 additions & 4 deletions src/api/providers.rs
@@ -43,6 +43,7 @@ pub(super) struct ProviderStatus {
openai: bool,
openai_chatgpt: bool,
openrouter: bool,
kilo: bool,
zhipu: bool,
groq: bool,
together: bool,
@@ -127,6 +128,7 @@ fn provider_toml_key(provider: &str) -> Option<&'static str> {
"anthropic" => Some("anthropic_key"),
"openai" => Some("openai_key"),
"openrouter" => Some("openrouter_key"),
"kilo" => Some("kilo_key"),
"zhipu" => Some("zhipu_key"),
"groq" => Some("groq_key"),
"together" => Some("together_key"),
@@ -187,11 +189,17 @@ fn build_test_llm_config(provider: &str, credential: &str) -> crate::config::Llm
api_key: credential.to_string(),
name: None,
}),
"kilo" => Some(ProviderConfig {
api_type: ApiType::KiloGateway,
base_url: "https://api.kilo.ai/api/gateway".to_string(),
api_key: credential.to_string(),
name: Some("Kilo Gateway".to_string()),
}),
"zhipu" => Some(ProviderConfig {
api_type: ApiType::OpenAiCompletions,
api_type: ApiType::OpenAiChatCompletions,
base_url: "https://api.z.ai/api/paas/v4".to_string(),
api_key: credential.to_string(),
name: None,
name: Some("Z.AI (GLM)".to_string()),
}),
"groq" => Some(ProviderConfig {
api_type: ApiType::OpenAiCompletions,
@@ -266,10 +274,10 @@ fn build_test_llm_config(provider: &str, credential: &str) -> crate::config::Llm
name: None,
}),
"zai-coding-plan" => Some(ProviderConfig {
api_type: ApiType::OpenAiCompletions,
api_type: ApiType::OpenAiChatCompletions,
base_url: "https://api.z.ai/api/coding/paas/v4".to_string(),
api_key: credential.to_string(),
name: None,
name: Some("Z.AI Coding Plan".to_string()),
}),
_ => None,
};
@@ -282,6 +290,7 @@ fn build_test_llm_config(provider: &str, credential: &str) -> crate::config::Llm
anthropic_key: (provider == "anthropic").then(|| credential.to_string()),
openai_key: (provider == "openai").then(|| credential.to_string()),
openrouter_key: (provider == "openrouter").then(|| credential.to_string()),
kilo_key: (provider == "kilo").then(|| credential.to_string()),
zhipu_key: (provider == "zhipu").then(|| credential.to_string()),
groq_key: (provider == "groq").then(|| credential.to_string()),
together_key: (provider == "together").then(|| credential.to_string()),
@@ -433,6 +442,7 @@ pub(super) async fn get_providers(
openai,
openai_chatgpt,
openrouter,
kilo,
zhipu,
groq,
together,
@@ -474,6 +484,7 @@ pub(super) async fn get_providers(
has_value("openai_key", "OPENAI_API_KEY"),
openai_oauth_configured,
has_value("openrouter_key", "OPENROUTER_API_KEY"),
has_value("kilo_key", "KILO_API_KEY"),
has_value("zhipu_key", "ZHIPU_API_KEY"),
has_value("groq_key", "GROQ_API_KEY"),
has_value("together_key", "TOGETHER_API_KEY"),
@@ -497,6 +508,7 @@ pub(super) async fn get_providers(
std::env::var("OPENAI_API_KEY").is_ok(),
openai_oauth_configured,
std::env::var("OPENROUTER_API_KEY").is_ok(),
std::env::var("KILO_API_KEY").is_ok(),
std::env::var("ZHIPU_API_KEY").is_ok(),
std::env::var("GROQ_API_KEY").is_ok(),
std::env::var("TOGETHER_API_KEY").is_ok(),
@@ -520,6 +532,7 @@ pub(super) async fn get_providers(
openai,
openai_chatgpt,
openrouter,
kilo,
zhipu,
groq,
together,
@@ -540,6 +553,7 @@ pub(super) async fn get_providers(
|| providers.openai
|| providers.openai_chatgpt
|| providers.openrouter
|| providers.kilo
|| providers.zhipu
|| providers.groq
|| providers.together