An MCP (Model Context Protocol) server backed by PostgreSQL + pgvector that provides enterprise architecture context and institutional memory for LLM-assisted decision-making in banking.
EA-Brain is a contextual memory system for enterprise architects. It stores architecture knowledge — principles, decisions, heuristics, domain blueprints, regulatory context, stakeholder intelligence — in a vector database and exposes it via MCP tools that any LLM client can call.
When an LLM is asked to help with an architecture decision, it can:
- Search the knowledge base for relevant context
- Retrieve precedents — prior ADRs and decision outcomes
- Surface heuristics — watch-outs and lessons learned
- Build context bundles — multi-category briefing packs
- Use prompt patterns — consistent templates for architecture tasks
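For example, an MCP client might invoke the `ea_search` tool with a request like the following. This is illustrative only — the argument names (`query`, `category`, `limit`) are assumptions, not the server's documented schema:

```json
{
  "name": "ea_search",
  "arguments": {
    "query": "precedents for migrating core banking batch jobs to event streaming",
    "category": "09-architecture-decisions-adrs",
    "limit": 5
  }
}
```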
```
┌──────────────────┐    MCP (stdio)     ┌──────────────────┐
│    LLM Client    │◄──────────────────►│   EA-Brain MCP   │
│  (Claude, etc.)  │                    │      Server      │
└──────────────────┘                    └────────┬─────────┘
                                                 │
                                        ┌────────▼─────────┐
                                        │  PostgreSQL +    │
                                        │  pgvector        │
                                        └────────┬─────────┘
                                                 │
                                        ┌────────▼─────────┐
                                        │   ea-brain/      │
                                        │  (markdown KB)   │
                                        └──────────────────┘
```
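Retrieval in this architecture hinges on pgvector's cosine-distance operator (`<=>`), via a query shaped roughly like `SELECT ... ORDER BY embedding <=> $1 LIMIT $2`. A self-contained TypeScript sketch of the metric itself — not the server's actual code:

```typescript
// Cosine distance as computed by pgvector's `<=>` operator:
// 1 - (a · b) / (|a| |b|). Lower distance = more similar.
function cosineDistance(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return 1 - dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

Identical vectors score 0, orthogonal vectors 1 — which is why the embedding model and dimension count (see the configuration below) must match between ingestion and query time.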
- Node.js ≥ 20
- Docker (for PostgreSQL)
- An OpenAI API key (or Ollama for local embeddings)
```bash
npm install
docker compose up -d
cp .env.example .env
# Edit .env and set your OPENAI_API_KEY
npm run db:migrate
npm run ingest:all
npm run build
```

Claude Desktop — add to `~/Library/Application Support/Claude/claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "ea-brain": {
      "command": "node",
      "args": ["/Users/desmoriarty/OneDrive/Code/EABrain/dist/server.js"],
      "env": {
        "DATABASE_URL": "postgresql://ea_brain:ea_brain_secret@localhost:5432/ea_brain",
        "OPENAI_API_KEY": "sk-your-key-here",
        "EMBEDDING_PROVIDER": "openai",
        "EMBEDDING_MODEL": "text-embedding-3-small",
        "EMBEDDING_DIMENSIONS": "1536"
      }
    }
  }
}
```

VS Code Copilot — the `.vscode/mcp.json` is already configured. Set `OPENAI_API_KEY` in your environment.
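For reference, a VS Code MCP config typically has the following shape — this is a sketch of the common format, and the checked-in `.vscode/mcp.json` may differ in its exact fields and paths:

```json
{
  "servers": {
    "ea-brain": {
      "type": "stdio",
      "command": "node",
      "args": ["${workspaceFolder}/dist/server.js"]
    }
  }
}
```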
| Tool | Description |
|---|---|
| `ea_search` | Semantic search across the entire knowledge base |
| `ea_search_chunks` | Fine-grained chunk-level search for large documents |
| `ea_get_context_bundle` | Multi-category briefing pack for a decision topic |
| `ea_list_categories` | List all knowledge categories with descriptions |
| `ea_get_precedents` | Find prior ADRs and decision session outcomes |
| `ea_get_heuristics` | Surface watch-outs, anti-patterns, and lessons learned |
| `ea_get_prompt_pattern` | Retrieve reusable architecture prompt templates |
| Resource | Description |
|---|---|
| `ea-brain://categories` | All knowledge categories |
| `ea-brain://stats` | Document counts and token totals per category |
| `ea-brain://category/{id}` | Documents in a specific category |
| `ea-brain://document/{path}` | A specific document by source path |
```
ea-brain/
├── 00-group-strategy-and-enterprise-context/
├── 01-operating-model-and-governance/
├── 02-enterprise-principles/
├── 03-domain-blueprints/
├── 04-platforms-and-systems/
├── 05-data-and-information-architecture/
├── 06-integration-and-eventing/
├── 07-risk-regulation-and-controls/
├── 08-ai-automation-and-decision-support/
├── 09-architecture-decisions-adrs/
├── 10-stakeholder-intelligence/
├── 11-heuristics-and-watchouts/
├── 12-prompt-patterns/
├── 13-tribe-blueprints-and-roadmaps/
├── 14-decision-sessions-and-paper-history/
├── 15-reference-material/
└── index.md
```
Add markdown files to any category folder. Use frontmatter for metadata:
```yaml
---
title: My Architecture Document
category: 02-enterprise-principles
tags: [principles, cloud, API]
---
```

Then re-ingest: `npm run ingest:all` or `npm run ingest -- --category 02-enterprise-principles`
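Ingestion has to separate the frontmatter block from the document body before embedding. A minimal TypeScript sketch of that split — the real pipeline lives behind `npm run ingest` and may parse differently:

```typescript
// Split a markdown document into its raw frontmatter (between the
// leading `---` fences) and the remaining body text.
function splitFrontmatter(doc: string): { frontmatter: string; body: string } {
  const match = doc.match(/^---\n([\s\S]*?)\n---\n?([\s\S]*)$/);
  if (!match) return { frontmatter: "", body: doc };
  return { frontmatter: match[1], body: match[2] };
}
```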
```bash
npm run dev     # Watch mode with tsx
npm run build   # Compile TypeScript
npm test        # Run tests
```

| Provider | Config | Notes |
|---|---|---|
| OpenAI | `EMBEDDING_PROVIDER=openai` | Best quality, requires API key |
| Azure OpenAI | `EMBEDDING_PROVIDER=azure` | For Azure-hosted deployments |
| Ollama | `EMBEDDING_PROVIDER=ollama` | Free, local, good with `nomic-embed-text` |
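Switching to local embeddings with Ollama might look like this in `.env`. The `OLLAMA_BASE_URL` key is a hypothetical name — check `.env.example` for the variable your build actually reads:

```ini
EMBEDDING_PROVIDER=ollama
EMBEDDING_MODEL=nomic-embed-text
EMBEDDING_DIMENSIONS=768
# Hypothetical key name; 11434 is Ollama's default port
OLLAMA_BASE_URL=http://localhost:11434
```

Note that changing the embedding model or dimensions requires re-running `npm run ingest:all`, since stored vectors must come from the same model as query vectors.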
MIT