A Model Context Protocol (MCP) server built in Erlang that orchestrates requests to different LLMs (ChatGPT, Gemini, Claude, etc.) using OTP principles.

## Features
- OTP-based architecture for robust, fault-tolerant operation
- Support for multiple LLM providers (OpenAI, Google Gemini, Anthropic Claude)
- Dynamic model registration and management
- RESTful API for chat completions and model listing
- Automatic failover and load balancing between models
## Architecture

The application follows OTP design principles with the following components:
- `mcp_app`: Main application module
- `mcp_sup`: Top-level supervisor
- `mcp_model_sup`: Dynamic supervisor for LLM connections
- `mcp_orchestrator`: Central orchestrator for routing requests
- `mcp_model`: `gen_server` implementation for each LLM type
- HTTP API handlers for external communication
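As an illustration of how these components could fit together, here is a minimal sketch of the top-level supervisor. It assumes the module names listed above; the actual restart strategies, child specs, and arguments may differ in the real implementation.

```erlang
%% Hypothetical sketch of the top-level supervisor, based on the
%% component list above. Actual child specs may differ.
-module(mcp_sup).
-behaviour(supervisor).
-export([start_link/0, init/1]).

start_link() ->
    supervisor:start_link({local, ?MODULE}, ?MODULE, []).

init([]) ->
    %% Restart a crashed child without affecting its siblings.
    SupFlags = #{strategy => one_for_one,
                 intensity => 5,
                 period => 10},
    Children = [#{id => mcp_orchestrator,
                  start => {mcp_orchestrator, start_link, []},
                  restart => permanent,
                  type => worker},
                %% Dynamic supervisor that owns one worker per LLM connection.
                #{id => mcp_model_sup,
                  start => {mcp_model_sup, start_link, []},
                  restart => permanent,
                  type => supervisor}],
    {ok, {SupFlags, Children}}.
```

With `one_for_one`, a crashing LLM connection is restarted in isolation, which is what gives the server its fault-tolerant behavior.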
## Prerequisites

- Erlang/OTP 24 or later
- Rebar3
## Building and Running

```shell
cd mcp_server
rebar3 compile
rebar3 shell
```

## API Endpoints

- `POST /api/v1/chat`: Send a chat request to an LLM
- `GET /api/v1/models`: List all available models
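For reference, the endpoints might be exercised like this. Note that the port (8080) and the JSON request fields (`model`, `messages`) are assumptions for illustration, not the project's confirmed schema.

```
# Hypothetical chat request; field names are assumed, not confirmed.
curl -X POST http://localhost:8080/api/v1/chat \
  -H "Content-Type: application/json" \
  -d '{"model": "claude", "messages": [{"role": "user", "content": "Hello"}]}'

# List all registered models
curl http://localhost:8080/api/v1/models
```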
## Configuration

LLM API keys and other configuration should be provided through environment variables or a config file.
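A config file could take the shape of a standard OTP `sys.config`. The application name and key names below are assumptions for illustration; check the actual `*.app.src` and config schema for the real keys.

```erlang
%% Hypothetical sys.config fragment; app and key names are assumptions.
[
 {mcp_server, [
   {http_port, 8080},
   {openai_api_key,    "set-me-or-use-OPENAI_API_KEY"},
   {gemini_api_key,    "set-me-or-use-GEMINI_API_KEY"},
   {anthropic_api_key, "set-me-or-use-ANTHROPIC_API_KEY"}
 ]}
].
```

Values read at runtime with `application:get_env/2` can fall back to `os:getenv/1` so that environment variables take precedence over the file.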
## License

MIT