From 78bf4dbb8ecf557a197d319bfce1c656b624a347 Mon Sep 17 00:00:00 2001
From: thepagent
Date: Wed, 15 Apr 2026 20:35:40 -0400
Subject: [PATCH] docs: add provider setup, model selection, and Ollama Cloud
 example to opencode guide

Closes #377
---
 docs/opencode.md | 79 +++++++++++++++++++++++++++++++++++++++++++++---
 1 file changed, 75 insertions(+), 4 deletions(-)

diff --git a/docs/opencode.md b/docs/opencode.md
index 8d452d0e..97c4117c 100644
--- a/docs/opencode.md
+++ b/docs/opencode.md
@@ -2,6 +2,26 @@
 
 OpenCode supports ACP natively via the `acp` subcommand — no adapter needed.
 
+OpenCode supports [75+ LLM providers](https://opencode.ai/docs/providers/) via the AI SDK, making it the most flexible backend for OpenAB. Users bring their own provider — no separate API keys per backend needed.
+
+```
+┌──────────┐  Discord   ┌────────┐  ACP stdio  ┌──────────┐   ┌───────────────────┐
+│ Discord  │◄──────────►│ OpenAB │◄───────────►│ OpenCode │──►│   LLM Providers   │
+│ Users    │  Gateway   │ (Rust) │  JSON-RPC   │  (ACP)   │   │                   │
+└──────────┘            └────────┘             └──────────┘   │ ┌───────────────┐ │
+                                                    │         │ │ Ollama Cloud  │ │
+                                             opencode.json    │ │ OpenAI        │ │
+                                              sets model      │ │ Anthropic     │ │
+                                                              │ │ AWS Bedrock   │ │
+                                                              │ │ GitHub Copilot│ │
+                                                              │ │ Groq          │ │
+                                                              │ │ OpenRouter    │ │
+                                                              │ │ Ollama (local)│ │
+                                                              │ │ 75+ more...   │ │
+                                                              │ └───────────────┘ │
+                                                              └───────────────────┘
+```
+
 ## Docker Image
 
 ```bash
@@ -15,12 +35,14 @@ The image installs `opencode-ai` globally via npm on `node:22-bookworm-slim`.
 ```bash
 helm install openab openab/openab \
   --set agents.kiro.enabled=false \
-  --set agents.opencode.discord.botToken="$DISCORD_BOT_TOKEN" \
-  --set-string 'agents.opencode.discord.allowedChannels[0]=YOUR_CHANNEL_ID' \
-  --set agents.opencode.image=ghcr.io/openabdev/openab-opencode:latest \
+  --set agents.opencode.enabled=true \
   --set agents.opencode.command=opencode \
   --set 'agents.opencode.args={acp}' \
-  --set agents.opencode.workingDir=/home/node
+  --set agents.opencode.image=ghcr.io/openabdev/openab-opencode:latest \
+  --set agents.opencode.discord.botToken="$DISCORD_BOT_TOKEN" \
+  --set-string 'agents.opencode.discord.allowedChannels[0]=YOUR_CHANNEL_ID' \
+  --set agents.opencode.workingDir=/home/node \
+  --set agents.opencode.pool.maxSessions=3
 ```
 
 > Set `agents.kiro.enabled=false` to disable the default Kiro agent.
@@ -46,7 +68,56 @@ Follow the browser OAuth flow, then restart the pod:
 kubectl rollout restart deployment/openab-opencode
 ```
 
+## Providers
+
+OpenCode supports multiple providers. Add any of them via `opencode auth login`:
+
+- **Ollama Cloud** — free tier available, models like `gemini-3-flash-preview`, `qwen3-coder-next`, `deepseek-v3.2`
+- **OpenCode Zen / Go** — tested and verified models provided by the OpenCode team (e.g. `opencode/big-pickle`, `opencode/gpt-5-nano`)
+- **OpenAI, Anthropic, AWS Bedrock, GitHub Copilot, Groq, OpenRouter** — and [75+ more](https://opencode.ai/docs/providers/)
+
+To list all available models across configured providers:
+
+```bash
+kubectl exec deployment/openab-opencode -- opencode models
+```
+
+## Example: Ollama Cloud with gemini-3-flash-preview
+
+### 1. Authenticate Ollama Cloud
+
+```bash
+kubectl exec -it deployment/openab-opencode -- opencode auth login -p "ollama cloud"
+```
+
+### 2. Set default model
+
+Create `opencode.json` in the working directory (`/home/node`).
+OpenCode reads it as project-level config:
+
+```bash
+kubectl exec deployment/openab-opencode -- sh -c \
+  'echo "{\"model\": \"ollama-cloud/gemini-3-flash-preview\"}" > /home/node/opencode.json'
+```
+
+This file is on the PVC and persists across restarts.
+
+### 3. Restart to pick up config
+
+```bash
+kubectl rollout restart deployment/openab-opencode
+```
+
+### 4. Verify
+
+```bash
+kubectl logs deployment/openab-opencode --tail=5
+# Should show: discord bot connected
+```
+
+`@mention` the bot in your Discord channel to start chatting.
+
 ## Notes
 
 - **Tool authorization**: OpenCode handles tool authorization internally and never emits `session/request_permission` — all tools run without user confirmation, equivalent to `--trust-all-tools` on other backends.
+- **Model selection**: Set the default model via `opencode.json` in the working directory using the `provider/model` format (e.g. `ollama-cloud/gemini-3-flash-preview`).
 - **Frequent releases**: OpenCode releases very frequently (often daily). The pinned version in `Dockerfile.opencode` should be bumped via a dedicated PR when an update is needed.
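
A note on step 2 of the added example: the inline `sh -c 'echo "{\"model\": ...}"'` relies on nested quote escaping, which is easy to get wrong. A local sketch of the same payload using a shell variable instead of nested escapes (the `/tmp` path is illustrative, not part of the patch):

```shell
# Build the opencode.json payload in a plain single-quoted string;
# printf '%s\n' writes it verbatim, with no \" escaping needed.
config='{"model": "ollama-cloud/gemini-3-flash-preview"}'
printf '%s\n' "$config" > /tmp/opencode.json

# The file now holds exactly one JSON object:
cat /tmp/opencode.json
# {"model": "ollama-cloud/gemini-3-flash-preview"}
```

From there, `kubectl cp /tmp/opencode.json <pod>:/home/node/opencode.json` is an alternative to the inline `echo` shown in the patch.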