A sophisticated Discord chat bot framework with multi-LLM support, MCP tool integration, and advanced context management.
- Multi-Participant Context: Honest representation of Discord conversations
- Multiple LLM Providers: Anthropic, AWS Bedrock, OpenAI-compatible, Google Gemini
- Prefill & Chat Modes: Full support for both conversation modes
- MCP Tool Integration: Native Model Context Protocol support
- Rolling Context: Efficient prompt caching with rolling message windows
- Hierarchical Configuration: YAML-based config with guild/channel overrides
- Image Support: Automatic image caching and vision input
- Advanced Features: History commands, m commands, dot messages
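The rolling/recency window mentioned above trims history from newest to oldest, stopping at whichever configured limit (message count or total characters) is reached first. A minimal TypeScript sketch of that policy (the function and its exact cutoff behavior are illustrative assumptions, not the actual implementation):

```typescript
interface Message {
  content: string;
}

// Keep the most recent messages, stopping at whichever limit
// (message count or total characters) is reached first.
function applyRecencyWindow(
  messages: Message[],
  maxMessages: number,
  maxCharacters: number
): Message[] {
  const kept: Message[] = [];
  let chars = 0;
  for (let i = messages.length - 1; i >= 0; i--) {
    const m = messages[i];
    if (kept.length >= maxMessages) break;
    if (chars + m.content.length > maxCharacters) break;
    kept.unshift(m);
    chars += m.content.length;
  }
  return kept;
}

const history = [
  { content: "a".repeat(60) },
  { content: "b".repeat(60) },
  { content: "c".repeat(60) },
];

// Character limit (100) binds before the message limit (400):
// only the newest 60-character message fits.
const kept = applyRecencyWindow(history, 400, 100);
console.log(kept.length); // 1
```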
```sh
npm install
```

Create a `discord_token` file with your bot token:

```sh
echo "your-discord-bot-token" > discord_token
```

Set environment variables (optional; these have defaults):

```sh
export CONFIG_PATH=./config   # Default: ./config
export TOOLS_PATH=./tools     # Default: ./tools
export CACHE_PATH=./cache     # Default: ./cache
export LOG_LEVEL=info         # Default: info

# Optional: Enable REST API
export API_BEARER_TOKEN=$(openssl rand -hex 32)  # Generate secure token
export API_PORT=3000          # Default: 3000
```

Note: The bot name is automatically determined from the Discord bot's username. Config is loaded from `config/bots/{discord-username}.yaml`.
To enable the REST API, set `API_BEARER_TOKEN`:

```sh
echo "your-secure-api-token" > api_token
export API_BEARER_TOKEN=$(cat api_token)
```

The API will be available at http://localhost:3000 (or your configured `API_PORT`).
Create `config/bots/your-bot-name.yaml`:

```yaml
name: Claude                      # Name used in LLM context
mode: prefill
continuationModel: claude-3-5-sonnet-20241022
temperature: 1.0
maxTokens: 4096
recencyWindowMessages: 400        # Optional: max messages
recencyWindowCharacters: 100000   # Optional: max characters
rollingThreshold: 50
includeImages: true
maxImages: 5
toolsEnabled: true
toolOutputVisible: false
```

Create `config/shared.yaml`:
Anthropic:

```yaml
vendors:
  anthropic:
    config:
      anthropic_api_key: "sk-ant-..."
    provides:
      - "claude-3-5-sonnet-*"
      - "claude-3-opus-*"
      - "claude-sonnet-4-*"
```

OpenAI (or compatible API):
```yaml
vendors:
  openai:
    config:
      openai_api_key: "sk-..."
      openai_base_url: "https://api.openai.com/v1"  # Optional, for compatible APIs
    provides:
      - "gpt-4o*"
      - "gpt-4-turbo*"
      - "gpt-3.5-turbo*"
```

Notes on the OpenAI provider:

- Only supports `mode: chat` (not prefill; OpenAI doesn't allow partial assistant messages)
- Images are not yet supported (the format differs from Anthropic's)
- Prefill support with OpenAI-compatible APIs requires additional providers (OpenRouter, completions API)
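The `provides` lists above map model IDs to vendors with `*` wildcards. A sketch of how such pattern routing could work (`globToRegExp` and `resolveVendor` are illustrative names, not ChapterX's actual API):

```typescript
// Convert a simple glob pattern (only "*" wildcards) to a RegExp.
function globToRegExp(pattern: string): RegExp {
  const escaped = pattern.replace(/[.+?^${}()|[\]\\]/g, "\\$&");
  return new RegExp(`^${escaped.replace(/\*/g, ".*")}$`);
}

// Pick the first vendor whose "provides" list matches the model ID.
function resolveVendor(
  model: string,
  vendors: Record<string, { provides: string[] }>
): string | undefined {
  for (const [name, vendor] of Object.entries(vendors)) {
    if (vendor.provides.some((p) => globToRegExp(p).test(model))) return name;
  }
  return undefined;
}

const vendors = {
  anthropic: { provides: ["claude-3-5-sonnet-*", "claude-3-opus-*"] },
  openai: { provides: ["gpt-4o*", "gpt-3.5-turbo*"] },
};

console.log(resolveVendor("claude-3-5-sonnet-20241022", vendors)); // anthropic
console.log(resolveVendor("gpt-4o-mini", vendors)); // openai
```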
```sh
# Development
npm run dev

# Production
npm run build
npm start
```

- Architecture - Detailed architecture documentation
- Requirements - Full functional requirements
- Plugins - Plugin system documentation
- Deployment - Production deployment guide
- Agent Loop: Main orchestrator
- Discord Connector: Handles all Discord API interactions
- Context Builder: Transforms Discord → participant format
- LLM Middleware: Transforms participant → provider format
- Tool System: MCP integration and JSONL persistence
- Config System: Hierarchical YAML configuration
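The hierarchical configuration can be pictured as a shallow merge where more specific layers (guild, then channel) override the base bot config. A hedged sketch (the last-layer-wins-per-key merge semantics shown here are an assumption, not the documented behavior):

```typescript
interface BotConfig {
  temperature?: number;
  maxTokens?: number;
  toolsEnabled?: boolean;
  [key: string]: unknown;
}

// Later layers win: base <- guild override <- channel override.
function mergeConfig(...layers: BotConfig[]): BotConfig {
  return layers.reduce((acc, layer) => ({ ...acc, ...layer }), {});
}

const base = { temperature: 1.0, maxTokens: 4096, toolsEnabled: true };
const guild = { temperature: 0.7 };
const channel = { maxTokens: 2000 };

console.log(mergeConfig(base, guild, channel));
// { temperature: 0.7, maxTokens: 2000, toolsEnabled: true }
```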
```yaml
# Identity
name: BotName                     # Name used in LLM context (prefill labels, stop sequences)

# Model
mode: prefill                     # or 'chat'
continuationModel: claude-3-5-sonnet-20241022
temperature: 1.0
maxTokens: 4096
topP: 1.0

# Context
recencyWindowMessages: 400        # Optional: max messages
recencyWindowCharacters: 100000   # Optional: max characters
# When both are specified, whichever limit is reached first is used
rollingThreshold: 50

# Images
includeImages: true
maxImages: 5

# Tools
toolsEnabled: true
toolOutputVisible: false
maxToolDepth: 100

# Retry
llmRetries: 3
discordBackoffMax: 32000

# Misc
systemPrompt: "Optional system prompt"
replyOnRandom: 0
replyOnName: false
maxQueuedReplies: 1
```

History Command (requires authorized role):
```
.history botname
---
first: https://discord.com/channels/.../message_id
last: https://discord.com/channels/.../message_id
```
Config Command (must be pinned):

```
.config botname
---
temperature: 0.7
maxTokens: 2000
```
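A pinned config message carries its overrides after the `---` separator. A rough sketch of parsing such a message with a tiny flat `key: value` parser (the helper name and exact parsing rules are assumptions; a real implementation would likely use a YAML library):

```typescript
// Parse a pinned ".config botname" message: YAML-style overrides
// follow a "---" separator line. Handles flat "key: value" pairs only.
function parseConfigMessage(
  text: string
): { bot: string; overrides: Record<string, string | number | boolean> } | null {
  const [header, ...rest] = text.split("\n---\n");
  const match = header.trim().match(/^\.config\s+(\S+)$/);
  if (!match || rest.length === 0) return null;
  const overrides: Record<string, string | number | boolean> = {};
  for (const line of rest.join("\n---\n").split("\n")) {
    const kv = line.match(/^(\w+):\s*(.+)$/);
    if (!kv) continue;
    const raw = kv[2].trim();
    overrides[kv[1]] =
      raw === "true" ? true :
      raw === "false" ? false :
      Number.isNaN(Number(raw)) ? raw : Number(raw);
  }
  return { bot: match[1], overrides };
}

const parsed = parseConfigMessage(".config botname\n---\ntemperature: 0.7\nmaxTokens: 2000");
console.log(parsed);
// { bot: "botname", overrides: { temperature: 0.7, maxTokens: 2000 } }
```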
M Commands:

- `m continue` - Activate bot without mention
The bot includes a comprehensive tracing system that captures every activation, including Discord context, LLM requests/responses, tool executions, and console logs.
Start the local web viewer to browse and search traces:

```sh
./trace serve
# Opens at http://localhost:3847
```

Features:
- Search by Discord URL: Paste any Discord message URL to find related traces
- Full LLM request/response viewer: See exactly what was sent to the API
- Context transformation details: Understand how Discord messages became LLM context
- Console log filtering: Filter logs by level (debug, info, warn, error)
- Token usage & cost info: Track API usage per activation
```sh
# List recent traces
./trace list --limit 10

# Show trace summary
./trace explain <trace-id>

# View full LLM request
./trace request <trace-id>

# View full LLM response
./trace response <trace-id>

# View console logs
./trace logs <trace-id>
```

Traces are stored in `logs/traces/` as JSON files, with an index at `logs/traces/index.jsonl` for fast lookups.
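The JSONL index also allows programmatic lookups. A hedged sketch (the entry fields `id`, `channelId`, and `messageId` are assumptions about the index schema, not the documented format):

```typescript
interface TraceIndexEntry {
  id: string;        // assumed field names; inspect index.jsonl for the real schema
  channelId: string;
  messageId: string;
}

// Parse index.jsonl content and find traces for a specific Discord message.
function findTraces(jsonl: string, channelId: string, messageId: string): TraceIndexEntry[] {
  return jsonl
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as TraceIndexEntry)
    .filter((e) => e.channelId === channelId && e.messageId === messageId);
}

const index = [
  '{"id":"t1","channelId":"456","messageId":"789"}',
  '{"id":"t2","channelId":"456","messageId":"790"}',
].join("\n");

console.log(findTraces(index, "456", "789").map((e) => e.id)); // [ 't1' ]
```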
```sh
# Install dependencies
npm install

# Development mode
npm run dev

# Build
npm run build

# Lint
npm run lint

# Format
npm run format

# Test
npm test
```

ChapterX uses membrane as its LLM abstraction layer. Membrane is installed as a git dependency and won't auto-update with a regular `npm install`.

To update membrane to the latest version:

```sh
npm update membrane
```

This will fetch the latest commit from the main branch and update your `package-lock.json`.

Note: After updating, you should commit the updated `package-lock.json` to keep your deployment in sync. A future release of membrane will be published to npm for easier version management.
Check current vs latest version:

```sh
# See what you have installed
npm ls membrane

# See latest on GitHub
git ls-remote https://github.com/antra-tess/membrane.git refs/heads/main | cut -c1-7
```

- Node.js 20+
- TypeScript 5.3+
- Discord bot token (in `discord_token` file)
- LLM API keys (Anthropic, OpenAI, etc.) in config files
If enabled with API_BEARER_TOKEN, the bot exposes a REST API for accessing Discord conversation history.
Health check (no auth required):

```sh
curl http://localhost:3000/health
```

Export Discord conversation history:

Authentication: Bearer token required

Request Body:

```json
{
  "last": "https://discord.com/channels/GUILD_ID/CHANNEL_ID/MESSAGE_ID",
  "first": "https://discord.com/channels/GUILD_ID/CHANNEL_ID/MESSAGE_ID",
  "recencyWindow": {
    "messages": 400,
    "characters": 100000
  }
}
```

Example:
```sh
curl -X POST http://localhost:3000/api/messages/export \
  -H "Authorization: Bearer your-token-here" \
  -H "Content-Type: application/json" \
  -d '{"last": "https://discord.com/channels/123/456/789"}'
```

ChapterX is developed to be compatible and interoperable with chapter2. Many critical concepts, including the use of Discord as the single source of truth and real-time configuration via pinned Discord messages, were pioneered in chapter2 by Janus and ampdot/joysatisficer.
MIT