
ChatFrame

A cross-platform desktop chatbot that unifies access to multiple LLM providers, supports MCP (Model Context Protocol) servers, and provides built-in retrieval-augmented generation (RAG) for local files. Available for macOS (Apple Silicon & Intel) and Windows (x86_64).

Overview

ChatFrame is built for developers and power users who want direct access to AI models and maximum value from their tokens. It delivers a single, polished interface for interacting with language models while giving users full control over their data. A plug-in system for custom tools via MCP and out-of-the-box RAG let you turn any PDF, text, or code file into searchable context, without uploading data to third-party services.

Quick Start

  1. Download the latest release for your OS from chatframe.co.
  2. Launch the app and open Providers to add your API keys. You can validate your configuration by clicking Verify.
  3. Click the first Chat button on the left to start chatting.

Supported Official LLM Providers

  • DeepSeek
  • OpenAI
  • Anthropic
  • xAI
  • OpenRouter
  • Groq
  • Moonshot
  • Cerebras
  • Qwen
  • Google AI Studio
  • Zhipu
  • GitHub Copilot

We update the provider model lists in the cloud, so you don't need to update the desktop app; just click the sync button.

ChatFrame also provides Custom Providers (OpenAI-compatible) for users running Ollama or other self-hosted LLMs, as long as the API follows the OpenAI format.
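For example, Ollama exposes an OpenAI-compatible endpoint at http://localhost:11434/v1 by default. A custom provider entry could look like the following sketch (the field names are illustrative, not ChatFrame's exact schema):

{
  "name": "Ollama (local)",
  "baseUrl": "http://localhost:11434/v1",
  "apiKey": "ollama",
  "model": "llama3.1"
}

Ollama itself ignores the API key, but OpenAI-compatible clients typically require a non-empty placeholder value.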

Recently (2025.11) we implemented a new provider integration: GitHub Copilot. Select GitHub Copilot and click Login; you can then reuse your existing subscription.

MCP (Model Context Protocol) Support

ChatFrame is listed on the official MCP website (https://modelcontextprotocol.io/clients#chatframe) and supports SSE, Streamable HTTP, and Stdio MCP servers. Go to Settings -> MCP to add your MCP servers.
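Remote servers (SSE or Streamable HTTP) usually need only a URL. A hypothetical Streamable HTTP entry might look like this (the server name, type, and url fields are assumptions modeled on common MCP client configs, not ChatFrame's documented schema):

{
  "my-remote-server": {
    "type": "streamable-http",
    "url": "https://example.com/mcp"
  }
}

Stdio servers instead launch a local process, which is covered in the next section.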

Runtime Environment (Stdio)

Stdio MCP servers require a local runtime environment. For example, running the Postgres MCP server through npx requires your own Node.js installation:

{
  "postgres": {
    "command": "npx",
    "args": [
      "-y",
      "@modelcontextprotocol/server-postgres",
      "postgresql://localhost/mydb"
    ]
  }
}

Why aren’t Node.js and Python bundled? I'd prefer to let users control their own runtime environments. Bundled interpreters can introduce version conflicts and increase the application footprint.

Agents

See the setup instructions to build your own Agents.

Chat

  • Invoke any MCP tool from within a conversation: type @ to open the MCP Servers list and let the AI use the selected tools to complete tasks
  • Multimodal input (images and text files)
  • Live artifacts – render React components, HTML, SVG, or code snippets in an isolated sandbox that updates in real time

Pricing

ChatFrame is currently paid software; a one-time payment grants a lifetime license.

Feature Map

  • Chat
    • Text chat
    • Model selection
    • Tool selection via MCP servers
    • Artifacts: create interactive content generated by the LLM
    • Attachments: upload images, PDFs, and text files (PDFs are parsed locally)
  • Projects
    • Create and manage projects
    • Start a chat from a project
  • MCP Servers
    • Install any kind of MCP server
    • Run the server by clicking Start
  • Model Providers
    • Official LLMs
    • Custom LLMs (OpenAI-compatible)
  • Settings
    • App
      • Appearance: Light / Dark
      • Updates: ChatFrame downloads updates in the background and displays an install button when ready
    • Shortcuts
      • New Chat: ⌘N
      • Toggle Sidebar: ⌘B
      • Open Settings: ⌘S
    • Advanced
      • Proxy URL: sets the all_proxy environment variable. Leave blank to disable proxying. When configured, all LLM API requests are routed through the specified proxy (see the example below).
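For example, setting the Proxy URL to a local SOCKS5 proxy (the address below is illustrative) has the same effect as exporting:

all_proxy=socks5://127.0.0.1:1080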
