This is an interactive web application for context-aware question answering over documents. It combines semantic search, a LangGraph workflow, and a ChatGroq LLM to provide concise, source-backed answers to user queries.
- Contextual Answers: Retrieves the most relevant passages from your documents and generates answers using an LLM.
- Conversation Memory: Maintains chat history for follow-up questions using LangGraph's in-memory checkpointer.
- Source Referencing: Displays the source document and page numbers for transparency.
- Easy-to-Use UI: Built with Streamlit for a responsive chat experience.
- Robust Error Handling: Gracefully handles missing API keys, empty vector DBs, and other runtime errors.
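To illustrate the retrieval idea behind "Contextual Answers" above, here is a minimal sketch that ranks passages by cosine similarity. It uses bag-of-words counts in place of the real embedding model, so it is a simplification of what the app actually does, not its implementation:

```python
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    # Bag-of-words term counts stand in for real embeddings in this sketch.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query: str, passages: list[str], k: int = 2) -> list[str]:
    # Return the k passages most similar to the query.
    qv = vectorize(query)
    return sorted(passages, key=lambda p: cosine(qv, vectorize(p)), reverse=True)[:k]

passages = [
    "LangGraph checkpoints conversation state between turns.",
    "Streamlit renders the chat interface in the browser.",
    "Semantic search ranks passages by similarity to the query.",
]
print(top_k("how does semantic search rank passages", passages, k=1))
```

In the app itself, a vector database plays the role of `passages`, and the top-ranked chunks are passed to the LLM as context for answer generation.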
This project uses uv as the Python package manager. Install it with:

```sh
curl -LsSf https://astral.sh/uv/install.sh | sh
```

Add uv to your PATH and verify the installation:

```sh
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc && source ~/.bashrc
uv --version
```
Clone the repository and set up the environment:

```sh
git clone https://github.com/yoursrealkiran/RAG.git
cd RAG
uv venv
source .venv/bin/activate
uv sync
```
Run the data ingestion script to prepare your data:

```sh
uv run document_processing.py
```
Paste your Groq API key into the `.env` file.
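For reference, the `.env` file is expected to contain a single line like the following (`GROQ_API_KEY` is the environment variable conventionally read by ChatGroq; replace the placeholder with your own key):

```env
GROQ_API_KEY=your_groq_api_key_here
```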
Launch the Streamlit application:

```sh
uv run streamlit run chatbot.py
```
- Make sure the required PDF files are in place before running the ingestion script (`document_processing.py`), and update `config.py` accordingly.
- The web app will be available in your browser once Streamlit starts.
- To deactivate the virtual environment, simply run `deactivate`.
