## Problem

The current `docs/opencode.md` covers basic setup but is missing provider setup, model selection, and Ollama Cloud as a worked example.
## Proposed Content

Add an "Ollama Cloud Example" section to `docs/opencode.md`:

### Deploy OpenAB with OpenCode + Ollama Cloud
#### 1. Install via Helm

```sh
helm repo add openab https://openabdev.github.io/openab
helm repo update
helm install openab openab/openab --version 0.7.6 \
  --set agents.kiro.enabled=false \
  --set agents.opencode.enabled=true \
  --set agents.opencode.command=opencode \
  --set 'agents.opencode.args={acp}' \
  --set agents.opencode.image=ghcr.io/openabdev/openab-opencode:0.7.6 \
  --set agents.opencode.discord.botToken="$DISCORD_BOT_TOKEN" \
  --set-string 'agents.opencode.discord.allowedChannels[0]=YOUR_CHANNEL_ID' \
  --set agents.opencode.workingDir=/home/node \
  --set agents.opencode.pool.maxSessions=3
```
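For repeated installs and upgrades, the same settings can live in a values file instead of a wall of `--set` flags. A sketch of the equivalent `values.yaml`, assuming the chart's value paths mirror the flag paths above one-to-one (keep the bot token out of the file and pass it on the command line or via a secret):

```yaml
# values.yaml — hedged equivalent of the --set flags above
agents:
  kiro:
    enabled: false
  opencode:
    enabled: true
    command: opencode
    args: ["acp"]
    image: ghcr.io/openabdev/openab-opencode:0.7.6
    discord:
      # botToken deliberately omitted — supply via --set at install time
      allowedChannels:
        - "YOUR_CHANNEL_ID"
    workingDir: /home/node
    pool:
      maxSessions: 3
```

Install then becomes `helm install openab openab/openab --version 0.7.6 -f values.yaml --set agents.opencode.discord.botToken="$DISCORD_BOT_TOKEN"`.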
#### 2. Authenticate Ollama Cloud

```sh
kubectl exec -it deployment/openab-opencode -- opencode auth login -p "ollama cloud"
```
#### 3. Set default model

The key is `opencode.json` in the working directory (`/home/node`). OpenCode reads it as project-level config:

```sh
kubectl exec deployment/openab-opencode -- sh -c \
  'echo "{\"model\": \"ollama-cloud/gemini-3-flash-preview\"}" > /home/node/opencode.json'
```
This file is on the PVC and persists across restarts.
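The nested quoting in that `sh -c` invocation is easy to get wrong, and a broken `opencode.json` fails silently. A quick local sanity check of the exact payload the escaped `echo` produces — no cluster needed; the `/tmp` path is only for illustration:

```shell
# Run locally the same echo that executes inside the container,
# then confirm the resulting file is the JSON we intended.
echo "{\"model\": \"ollama-cloud/gemini-3-flash-preview\"}" > /tmp/opencode.json
cat /tmp/opencode.json
# → {"model": "ollama-cloud/gemini-3-flash-preview"}
```

To inspect what actually landed in the pod, `kubectl exec deployment/openab-opencode -- cat /home/node/opencode.json` shows the live file.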
To list available models:

```sh
kubectl exec deployment/openab-opencode -- opencode models
```
#### 4. Restart to pick up config

```sh
kubectl rollout restart deployment/openab-opencode
```
#### 5. Verify

```sh
kubectl logs deployment/openab-opencode --tail=5
# Should show: discord bot connected
```

@mention the bot in your Discord channel to start chatting.
### Providers

OpenCode supports multiple providers. You can add any of them via `opencode auth login`:

- **Ollama Cloud** — free tier available, with models like `gemini-3-flash-preview`, `qwen3-coder-next`, and `deepseek-v3.2`
- **OpenCode subscription** — if you have an OpenCode subscription, you can use OpenCode-provided models like `opencode/big-pickle` or `opencode/gpt-5-nano` without additional provider setup

To list all available models across configured providers:

```sh
kubectl exec deployment/openab-opencode -- opencode models
```
### Architecture

OpenCode supports 75+ LLM providers via the AI SDK, making it the most flexible backend for OpenAB. Users bring their own provider — no separate API keys per backend needed.
```
┌──────────┐  Discord   ┌────────┐  ACP stdio  ┌──────────┐     ┌─────────────────────┐
│ Discord  │ ◄────────► │ OpenAB │ ◄─────────► │ OpenCode │ ──► │    LLM Providers    │
│  Users   │  Gateway   │ (Rust) │  JSON-RPC   │  (ACP)   │     │                     │
└──────────┘            └────────┘             └──────────┘     │ ┌─────────────────┐ │
                                                    │           │ │  Ollama Cloud   │ │
                                      opencode.json │           │ │  (free tier)    │ │
                                        sets model  │           │ ├─────────────────┤ │
                                                                │ │  OpenAI         │ │
                                                                │ │ (ChatGPT Plus)  │ │
                                                                │ ├─────────────────┤ │
                                                                │ │  Anthropic      │ │
                                                                │ ├─────────────────┤ │
                                                                │ │  AWS Bedrock    │ │
                                                                │ ├─────────────────┤ │
                                                                │ │  GitHub Copilot │ │
                                                                │ ├─────────────────┤ │
                                                                │ │  Groq           │ │
                                                                │ ├─────────────────┤ │
                                                                │ │  OpenRouter     │ │
                                                                │ ├─────────────────┤ │
                                                                │ │ Ollama (local)  │ │
                                                                │ ├─────────────────┤ │
                                                                │ │  75+ more...    │ │
                                                                │ └─────────────────┘ │
                                                                └─────────────────────┘
```
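The "ACP stdio / JSON-RPC" hop in the diagram is plain JSON-RPC 2.0 exchanged over the agent process's stdin/stdout. A hedged sketch of what a prompt request might look like on the wire — the `session/prompt` method and field names follow the published Agent Client Protocol, but the exact shape can vary by protocol version, so treat this as illustrative only:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "session/prompt",
  "params": {
    "sessionId": "sess-123",
    "prompt": [
      { "type": "text", "text": "List the files in the repo" }
    ]
  }
}
```

OpenAB plays the ACP client role here; OpenCode is the agent that answers over the same pipe.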
See the full provider list at https://opencode.ai/docs/providers/