# Mem1: A Hybrid Memory Framework for LLMs & AI Agents
Mem1 is an advanced memory layer designed to give AI agents long-term, consistent, and structured memory. It goes beyond simple vector retrieval by integrating a Knowledge Graph to capture relationships and a Vector Database for semantic search.
This project is an implementation inspired by the Mem0 research paper, extended with a robust GraphRAG approach to handle entity relationships (e.g., User -> working on -> Project).
## Features

- Hybrid Memory Architecture: Combines Qdrant (Vector DB) for semantic similarity with Memgraph (Graph DB) for structural associativity.
- GraphRAG Capabilities: Automatically extracts entities (Person, Project, Tool) and relationships from user conversations.
- Smart Entity Resolution: Uses fuzzy search plus LLM verification to de-duplicate entities (e.g., mapping "JS" and "Node" to JavaScript).
- Dynamic Fact Management: Intelligently decides whether to `ADD` a new fact, `UPDATE` an existing one, or `IGNORE` redundancy.
- Deep Context Retrieval: Performs 2-hop graph traversals to fetch context that is structurally related but might not be semantically similar.
- Observability: Integrated with Langfuse for tracing and monitoring agent performance.
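As a rough sketch of how the entity-resolution step could work (the function names, alias table, and threshold below are illustrative, not the project's actual API), fuzzy matching can shortlist duplicates before an LLM verifies borderline cases:

```python
from difflib import SequenceMatcher

# Hypothetical sketch: map raw entity mentions to canonical entities.
# The alias table and threshold are assumptions for illustration only.
KNOWN_ENTITIES = {"JavaScript", "Python", "Qdrant"}
ALIASES = {"js": "JavaScript", "node": "JavaScript"}

def fuzzy_score(a: str, b: str) -> float:
    """Case-insensitive string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def resolve_entity(name: str, threshold: float = 0.8) -> str:
    """Return the canonical entity for a mention, or keep it as new."""
    if name.lower() in ALIASES:              # exact alias hit
        return ALIASES[name.lower()]
    best = max(KNOWN_ENTITIES, key=lambda e: fuzzy_score(name, e))
    if fuzzy_score(name, best) >= threshold:  # confident fuzzy match
        return best
    # In the real pipeline an LLM would verify borderline matches;
    # this sketch simply treats the mention as a brand-new entity.
    return name

print(resolve_entity("JS"))        # -> JavaScript
print(resolve_entity("Memgraph"))  # -> Memgraph (treated as new)
```

A real implementation would likely call the LLM only for scores in an ambiguous middle band, keeping verification cheap.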
## Tech Stack

- Language: Python 3.11+
- Vector Database: Qdrant
- Graph Database: Memgraph (Neo4j compatible)
- App Database: MongoDB (via Beanie/Motor)
- Embeddings: HuggingFace Text Embeddings Inference (TEI)
- Environment: Nix & Docker Compose
- Dependency Management: `uv`
## How It Works

When a user sends a message, Mem1 processes it through the following pipeline:
- Summarization: Compresses recent chat history.
- Fact Extraction: Identifies new candidate facts.
- Semantic Retrieval: Queries Qdrant for similar past memories.
- Graph Retrieval: Traverses Memgraph to find related entities (2-hop neighborhood).
- Memory Write:
- Updates Vector DB with the new fact.
- Extracts triplets (Subject, Predicate, Object).
- Resolves entities (Entity Resolution) and upserts them into the Knowledge Graph.
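The memory-write decision above can be sketched as follows; the `Triplet` type, `decide_op` helper, and thresholds are hypothetical stand-ins for the project's real logic, using plain string similarity in place of embedding similarity:

```python
from dataclasses import dataclass
from difflib import SequenceMatcher

@dataclass(frozen=True)
class Triplet:
    """A (Subject, Predicate, Object) fact extracted from a message."""
    subject: str
    predicate: str
    obj: str

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def decide_op(new_fact: str, existing: list[str],
              update_above: float = 0.6, dup_above: float = 0.9) -> str:
    """Return ADD, UPDATE, or IGNORE for a candidate fact."""
    best = max((similarity(new_fact, old) for old in existing), default=0.0)
    if best >= dup_above:
        return "IGNORE"        # near-duplicate: nothing to do
    if best >= update_above:
        return "UPDATE"        # overlapping fact: revise in place
    return "ADD"               # genuinely new information

memories = ["User is working on Project Mem1"]
print(decide_op("User is working on Project Mem1", memories))  # IGNORE
print(decide_op("User now leads Project Mem1", memories))      # UPDATE
print(decide_op("User prefers Python", memories))              # ADD
```

In the real system the comparison would run over embedding vectors retrieved from Qdrant rather than raw strings, but the three-way ADD/UPDATE/IGNORE branching is the same shape.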
## Prerequisites

- Docker & Docker Compose (essential, for running the databases)
- Nix (Optional, for reproducible dev environment)
- Just (Command runner)
## Setup

Clone the repository and set up your environment variables:

```sh
cp .env.example .env
```

Make sure to populate `.env` with your API keys (OpenAI, etc.) and configuration preferences.
If you are using Nix:

```sh
nix develop
```

If you are using standard Python:

```sh
pip install uv
uv sync
```

Start the underlying services (Qdrant, Memgraph, MongoDB, Langfuse, TEI) using Docker Compose. We use `just` for convenience:
```sh
just start
```

Once the Docker services are healthy, start the main application:

```sh
just app
```

To view the application logs:

```sh
just logs
```

## Visualizing the Knowledge Graph

Mem1 includes Memgraph Lab to visualize your agent's Knowledge Graph in real time.
- Open your browser to http://localhost:3001.
- Connection Details:
  - Host: `localhost`
  - Port: `7687` (or `7688` if configured differently in `docker-compose.yml`)
  - Username/Password: see your `docker-compose.yml` (default: `memgraph` / `memg_password`)
- Run a Query: to see the entities created by your agent:

```cypher
MATCH (n)-[r]->(m) RETURN n, r, m LIMIT 100;
```

Thanks :)