GitChat is an open-source application that lets you chat with public GitHub repositories using Large Language Models (LLMs) like OpenAI's GPT. You can ask questions about any ingested repository, and the system will fetch relevant context from its codebase using RAG (Retrieval-Augmented Generation).
- 🗂 Ingest any public GitHub repo and store its code in a vector database.
- 🔍 Semantic search for relevant code snippets using embeddings.
- 💬 Chat interface with streaming responses from LLMs.
- 📦 Centralized vector store for all repos (shared across users).
- 📑 Metadata management via MongoDB.
- 🔄 Multi-model support — OpenAI, Google AI (future: Anthropic, local LLMs).
- ⚡ FastAPI backend + Streamlit frontend.
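The semantic-search feature above boils down to comparing embedding vectors by cosine similarity. A minimal, dependency-free sketch of the idea (toy 3-dimensional vectors stand in for real embedding-model output; the snippets and numbers are illustrative only):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" for three code snippets; in GitChat these would come
# from an embedding model and be stored in Weaviate.
snippets = {
    "def clone_repo(url): ...": [0.9, 0.1, 0.0],
    "def parse_config(path): ...": [0.1, 0.8, 0.2],
    "def run_query(sql): ...": [0.0, 0.2, 0.9],
}
query_vec = [0.85, 0.15, 0.05]  # embedding of the user's question

# Retrieve the snippet whose vector is closest to the query vector.
best = max(snippets, key=lambda s: cosine_similarity(snippets[s], query_vec))
print(best)
```

In the real system the nearest-neighbour search is delegated to the vector database rather than done in Python.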
```text
        ┌────────────────┐
        │  Streamlit UI  │
        └───────┬────────┘
                │
      ┌─────────▼─────────┐
      │    FastAPI API    │
      ├─────────┬─────────┤
      │   /gh   │  /chat  │
      └─────────┴─────────┘
                │
┌───────────────▼───────────────────────┐
│         Business Logic Layer          │
│  - GithubManager (repo cloning, etc.) │
│  - WeaviateManager (vector store)     │
│  - RAGQueryEngine (context + LLM)     │
└───────────────┬───────────────────────┘
                │
   ┌────────────▼─────────────┐
   │    MongoDB (Metadata)    │
   ├──────────────────────────┤
   │   Weaviate (Vector DB)   │
   └──────────────────────────┘
```
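To make the business-logic layer concrete, here is a hypothetical sketch of how `RAGQueryEngine` could combine vector-store retrieval with prompt construction. The class and component names come from the diagram above, but the method names, signatures, and prompt format are assumptions, and the LLM call and real Weaviate query are stubbed out:

```python
class WeaviateManagerStub:
    """Stand-in for the vector-store component (the real one queries Weaviate)."""

    def search(self, question: str, top_k: int = 3) -> list[str]:
        # Would embed `question` and run a nearest-neighbour search;
        # here we return canned snippets for illustration.
        return ["def clone_repo(url): ...", "def ingest(repo): ..."][:top_k]


class RAGQueryEngine:
    """Builds an LLM prompt from retrieved code context (LLM call omitted)."""

    def __init__(self, store: WeaviateManagerStub):
        self.store = store

    def build_prompt(self, question: str) -> str:
        context = "\n".join(self.store.search(question))
        return f"Answer using this code:\n{context}\n\nQuestion: {question}"


engine = RAGQueryEngine(WeaviateManagerStub())
prompt = engine.build_prompt("How does ingestion work?")
```

The retrieved snippets are inlined into the prompt so the LLM answers grounded in the repository's actual code rather than from memory.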
```text
.
├── backend
│   ├── app
│   │   ├── api
│   │   ├── __init__.py
│   │   ├── main.py
│   │   └── schemas/
│   ├── config.py
│   ├── database.py
│   ├── docker-compose.yaml
│   ├── __init__.py
│   ├── logger.py
│   ├── pyproject.toml
│   ├── utils
│   │   ├── embedder.py
│   │   ├── github_manager.py
│   │   ├── rag_engine.py
│   │   └── weaviate_manager.py
│   └── uv.lock
├── frontend
│   ├── main.py
│   ├── pages
│   │   └── chat.py
│   ├── pyproject.toml
│   ├── utils.py
│   └── uv.lock
├── logs
│   └── backend.log
├── mcp
└── README.md
```
1. Clone the repository:

```bash
git clone https://github.com/yourusername/gitchat.git
cd gitchat
```

2. Create a virtual environment:

```bash
cd backend
python -m venv venv
source venv/bin/activate   # Mac/Linux
venv\Scripts\activate      # Windows
```
3. Set up `.env` according to `.env.example`.
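The variable names below are hypothetical, not copied from the repo's `.env.example` — they simply reflect the services the stack uses (OpenAI, MongoDB, Weaviate), so check `.env.example` for the real keys:

```env
# Hypothetical example values — consult .env.example for the actual keys
OPENAI_API_KEY=sk-...
MONGODB_URI=mongodb://localhost:27017
WEAVIATE_URL=http://localhost:8080
```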
4. Install the backend deps from `pyproject.toml` (or just run the code with `uv run ...` to have the deps downloaded on first run).

5. Run the backend server:

```bash
uvicorn app.main:app
```
6. Install the frontend deps from `pyproject.toml` (or run with `uv run ...` as above).

7. Run the frontend app:

```bash
cd frontend
streamlit run main.py
```