A powerful, multi-assistant AI chat extension for VS Code. Chat with any OpenAI-compatible API, manage multiple assistants, organize sessions, and extend capabilities via MCP — all without leaving your editor.
- Multi-Assistant Management — Create, organize, and switch between multiple AI assistants with independent system prompts, models, and parameters.
- Multi-Provider Support — Works with OpenAI, Gemini, OpenRouter, Ollama, and any custom OpenAI-compatible endpoint.
- Session Management — Every assistant has its own session history. Rename, search, export, or clear sessions anytime.
- MCP Integration — Connect to MCP servers (stdio, SSE, HTTP) to extend AI capabilities with external tools, resources, and prompts.
- Streaming Responses — Real-time streaming output with typing effect. Interrupt generation at any time.
- Multimodal Chat — Paste images directly into the chat, with KaTeX math rendering and Mermaid diagram support.
- Tool Calling — Local function tools + MCP remote tools with user confirmation for dangerous actions.
- Model Capability Inference — Auto-detects model type (chat, image, video, audio, embedding, rerank) and capabilities (vision, reasoning, tools, webSearch).
- Bilingual UI — Full Chinese and English support with runtime language switching.
- Data Backup & Migration — Export/import structured ZIP backups. Automatic migration from legacy SQLite storage.
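The streaming feature above follows the standard OpenAI server-sent-events format, in which each `data:` line carries a JSON chunk whose `choices[0].delta.content` holds a text fragment and the stream terminates with `data: [DONE]`. A minimal sketch of reassembling text from such a stream (this models the standard wire format, not ChatBuddy's internal code):

```javascript
// Extract assistant text from an OpenAI-style SSE stream body.
// Each event line looks like: data: {"choices":[{"delta":{"content":"Hi"}}]}
// and the stream ends with:   data: [DONE]
function extractStreamText(sseBody) {
  let text = "";
  for (const line of sseBody.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue; // skip comments/blank keep-alives
    const payload = trimmed.slice(5).trim();
    if (payload === "[DONE]") break;           // end-of-stream sentinel
    try {
      const chunk = JSON.parse(payload);
      text += chunk.choices?.[0]?.delta?.content ?? "";
    } catch {
      // Ignore malformed or partial lines.
    }
  }
  return text;
}

const sample = [
  'data: {"choices":[{"delta":{"content":"Hel"}}]}',
  'data: {"choices":[{"delta":{"content":"lo"}}]}',
  "data: [DONE]",
].join("\n");
console.log(extractStreamText(sample)); // "Hello"
```

Interrupting generation then amounts to aborting the underlying HTTP request and keeping whatever text has been accumulated so far.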
| Platform | Method |
|---|---|
| VS Code | Search "ChatBuddy" in the Extensions Marketplace, or install online |
| VSCodium / Cursor | Install from Open VSX |
| Manual | Download .vsix from GitHub Releases |
- Open the ChatBuddy panel in the VS Code sidebar (look for the robot icon).
- Click Model Config in the Settings view to add your AI provider and API key.
- Create a new assistant in the Assistants view.
- Start chatting!
ChatBuddy is a generic OpenAI-compatible API client. It works with any API endpoint that implements the OpenAI protocol — just enter your base URL and API key.
- API formats: Supports both `chat/completions` and `responses` endpoints, configurable per provider.
- Model auto-fetch: Automatically retrieves model lists from providers that expose a `/models` endpoint.
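In practice, an OpenAI-compatible request needs only a base URL, an `Authorization: Bearer` header with the API key, and a standard `chat/completions` body. A hedged sketch of assembling such a request (the URL, key, and model name below are placeholders, not ChatBuddy defaults):

```javascript
// Build a standard OpenAI-style chat/completions request.
// baseUrl, apiKey, and model are illustrative placeholders.
function buildChatRequest(baseUrl, apiKey, model, userMessage) {
  return {
    url: `${baseUrl.replace(/\/$/, "")}/chat/completions`, // strip trailing slash
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: {
      model,
      messages: [{ role: "user", content: userMessage }],
      stream: true, // request SSE streaming
    },
  };
}

const req = buildChatRequest("https://api.example.com/v1/", "my-key", "my-model", "Hello!");
console.log(req.url); // "https://api.example.com/v1/chat/completions"
```

The request would then be sent with `fetch(req.url, { method: "POST", headers: req.headers, body: JSON.stringify(req.body) })`; any provider implementing this protocol works the same way.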
Screenshots (images omitted): Chat Interface · Provider Setup · Model Management · Default Models · MCP Settings · Assistant Profile · Data Management · Local Backup · Other Settings.
For detailed development documentation, see the `docs/` directory.
```bash
# Clone and set up
git clone https://github.com/Zheng404/ChatBuddy.git
cd ChatBuddy
npm install
npm run compile
# Press F5 in VS Code to launch the Extension Development Host
```
See CHANGELOG.md for release history.
This project is licensed under the MIT License.