dmoriart/ea-brain

EA-Brain — Enterprise Architecture Context Server

An MCP (Model Context Protocol) server backed by PostgreSQL + pgvector that provides enterprise architecture context and institutional memory for LLM-assisted decision-making in banking.

What is EA-Brain?

EA-Brain is a contextual memory system for enterprise architects. It stores architecture knowledge — principles, decisions, heuristics, domain blueprints, regulatory context, stakeholder intelligence — in a vector database and exposes it via MCP tools that any LLM client can call.

When an LLM is asked to help with an architecture decision, it can:

  1. Search the knowledge base for relevant context
  2. Retrieve precedents — prior ADRs and decision outcomes
  3. Surface heuristics — watch-outs and lessons learned
  4. Build context bundles — multi-category briefing packs
  5. Use prompt patterns — consistent templates for architecture tasks

Architecture

┌──────────────────┐     MCP (stdio)     ┌──────────────────┐
│  LLM Client      │◄──────────────────►│  EA-Brain MCP    │
│  (Claude, etc.)  │                     │  Server          │
└──────────────────┘                     └────────┬─────────┘
                                                  │
                                         ┌────────▼─────────┐
                                         │  PostgreSQL +    │
                                         │  pgvector        │
                                         └────────┬─────────┘
                                                  │
                                         ┌────────▼─────────┐
                                         │  ea-brain/       │
                                         │  (markdown KB)   │
                                         └──────────────────┘
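Under the hood, semantic search follows the standard pgvector pattern: embed the query with the configured provider, then order rows by vector distance. A minimal sketch — the `documents` table and column names here are assumptions for illustration; the actual schema lives in the migrations:

```sql
-- Nearest-neighbour lookup using pgvector's cosine-distance operator (<=>).
-- $1 is the query embedding produced by the configured embedding provider.
SELECT id, title, source_path,
       1 - (embedding <=> $1) AS similarity
FROM documents
ORDER BY embedding <=> $1
LIMIT 5;
```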

Quick Start

Prerequisites

  • Node.js ≥ 20
  • Docker (for PostgreSQL)
  • An OpenAI API key (or Ollama for local embeddings)

1. Install dependencies

npm install

2. Start PostgreSQL with pgvector

docker compose up -d
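If you need to recreate the compose file, a minimal sketch looks like this — the service name and volume name are assumptions, and the credentials simply mirror the `DATABASE_URL` used later in this README; the repo's own `docker-compose.yml` is authoritative:

```yaml
services:
  db:
    image: pgvector/pgvector:pg16   # Postgres with the pgvector extension preinstalled
    environment:
      POSTGRES_USER: ea_brain
      POSTGRES_PASSWORD: ea_brain_secret
      POSTGRES_DB: ea_brain
    ports:
      - "5432:5432"
    volumes:
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:
```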

3. Configure environment

cp .env.example .env
# Edit .env and set your OPENAI_API_KEY
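A typical `.env` ends up with the same variables the MCP client config uses:

```
DATABASE_URL=postgresql://ea_brain:ea_brain_secret@localhost:5432/ea_brain
OPENAI_API_KEY=sk-your-key-here
EMBEDDING_PROVIDER=openai
EMBEDDING_MODEL=text-embedding-3-small
EMBEDDING_DIMENSIONS=1536
```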

4. Run database migrations

npm run db:migrate

5. Ingest the knowledge base

npm run ingest:all

6. Build the server

npm run build

7. Connect your LLM client

Claude Desktop — add to ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "ea-brain": {
      "command": "node",
      "args": ["/Users/desmoriarty/OneDrive/Code/EABrain/dist/server.js"],
      "env": {
        "DATABASE_URL": "postgresql://ea_brain:ea_brain_secret@localhost:5432/ea_brain",
        "OPENAI_API_KEY": "sk-your-key-here",
        "EMBEDDING_PROVIDER": "openai",
        "EMBEDDING_MODEL": "text-embedding-3-small",
        "EMBEDDING_DIMENSIONS": "1536"
      }
    }
  }
}

VS Code Copilot — the .vscode/mcp.json is already configured. Set OPENAI_API_KEY in your environment.

MCP Tools

| Tool | Description |
| --- | --- |
| `ea_search` | Semantic search across the entire knowledge base |
| `ea_search_chunks` | Fine-grained chunk-level search for large documents |
| `ea_get_context_bundle` | Multi-category briefing pack for a decision topic |
| `ea_list_categories` | List all knowledge categories with descriptions |
| `ea_get_precedents` | Find prior ADRs and decision session outcomes |
| `ea_get_heuristics` | Surface watch-outs, anti-patterns, and lessons learned |
| `ea_get_prompt_pattern` | Retrieve reusable architecture prompt templates |
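Over stdio, these tools are invoked with standard MCP `tools/call` requests. A sketch of what the client sends — the argument names (`query`, `limit`) are an assumption about the tool's input schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ea_search",
    "arguments": { "query": "payments resiliency precedents", "limit": 5 }
  }
}
```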

MCP Resources

| Resource | Description |
| --- | --- |
| `ea-brain://categories` | All knowledge categories |
| `ea-brain://stats` | Document counts and token totals per category |
| `ea-brain://category/{id}` | Documents in a specific category |
| `ea-brain://document/{path}` | A specific document by source path |

Knowledge Base Structure

ea-brain/
├── 00-group-strategy-and-enterprise-context/
├── 01-operating-model-and-governance/
├── 02-enterprise-principles/
├── 03-domain-blueprints/
├── 04-platforms-and-systems/
├── 05-data-and-information-architecture/
├── 06-integration-and-eventing/
├── 07-risk-regulation-and-controls/
├── 08-ai-automation-and-decision-support/
├── 09-architecture-decisions-adrs/
├── 10-stakeholder-intelligence/
├── 11-heuristics-and-watchouts/
├── 12-prompt-patterns/
├── 13-tribe-blueprints-and-roadmaps/
├── 14-decision-sessions-and-paper-history/
├── 15-reference-material/
└── index.md

Add markdown files to any category folder. Use frontmatter for metadata:

---
title: My Architecture Document
category: 02-enterprise-principles
tags: [principles, cloud, API]
---

Then re-ingest everything, or just the changed category:

npm run ingest:all
npm run ingest -- --category 02-enterprise-principles

Development

npm run dev      # Watch mode with tsx
npm run build    # Compile TypeScript
npm test         # Run tests

Embedding Providers

| Provider | Config | Notes |
| --- | --- | --- |
| OpenAI | `EMBEDDING_PROVIDER=openai` | Best quality; requires API key |
| Azure OpenAI | `EMBEDDING_PROVIDER=azure` | For Azure-hosted deployments |
| Ollama | `EMBEDDING_PROVIDER=ollama` | Free, local; good with `nomic-embed-text` |
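To run fully local with Ollama, the embedding settings change along these lines — `OLLAMA_BASE_URL` is an assumed variable name, and `nomic-embed-text` produces 768-dimensional vectors, so `EMBEDDING_DIMENSIONS` (and the pgvector column dimension) must match; switching providers requires re-running the ingest:

```
EMBEDDING_PROVIDER=ollama
EMBEDDING_MODEL=nomic-embed-text
EMBEDDING_DIMENSIONS=768
OLLAMA_BASE_URL=http://localhost:11434
```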

License

MIT
