This project demonstrates an AI agent built using LangChain, LangGraph, and Ollama. The agent can answer questions based on a provided resume (stored in `data/cv.txt`) using Retrieval-Augmented Generation (RAG) or search the web for external information using DuckDuckGo.
The following diagram illustrates the logical flow of the agent:
- Conversational Agent: Interacts with the user via a command-line interface.
- Tool Use: Dynamically decides whether to use a tool based on the user's query.
- Resume RAG: Answers questions about resume content by embedding text chunks and retrieving relevant information.
- Web Search: Uses DuckDuckGo to find information about topics not covered in the resume.
- Ollama Integration: Leverages local Ollama models for both language generation and embeddings, ensuring privacy and offline capability.
- Modular Tools: Tool logic is separated into a `tools` directory.
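This separation keeps each tool self-contained and easy to swap out. A hypothetical, stripped-down sketch of the idea (illustration only; the real implementations live in `tools/resume_rag.py` and `tools/web_search.py` and use LangChain's tool interface):

```python
# Hypothetical, simplified stand-ins for the real tools (illustration only;
# the actual implementations use LangChain's tool abstractions).

def resume_rag_tool(query: str) -> str:
    """Stand-in: answer a question from the embedded resume chunks."""
    return f"[resume_rag] {query}"

def web_search_tool(query: str) -> str:
    """Stand-in: run a DuckDuckGo search for the query."""
    return f"[web_search] {query}"

# The agent picks from a registry of available tools by name.
TOOLS = {fn.__name__: fn for fn in (resume_rag_tool, web_search_tool)}
```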
- Conda: Ensure you have Conda installed. This is used for managing the Python environment. You can install it from Anaconda Distribution or Miniconda.
- Ollama: Install Ollama by following the instructions on the Ollama website.
- Clone the Repository (Optional): If you haven't already, clone the repository:

  ```shell
  git clone <your-repo-url>
  cd <repository-directory>
  ```
- Install and Run Ollama Models: Make sure the Ollama application/server is running. Then pull the required models (LLM and embedding model):

  ```shell
  ollama pull llama3.2          # The LLM used by the agent (can be changed in agent.py)
  ollama pull nomic-embed-text  # The embedding model used for RAG
  ```

  Important Note: Ollama must be running in the background before starting the agent.
- Create and Activate Conda Environment: Create a new conda environment (e.g., named `agent-env`) with a specific Python version (tested with 3.12):

  ```shell
  conda create -n agent-env python=3.12 -y
  conda activate agent-env
  ```
- Install Python Dependencies: Install the required Python packages using `pip` and the `requirements.txt` file:

  ```shell
  pip install -r requirements.txt
  ```
- Prepare Resume File: Place your resume content in a file named `cv.txt` in the `data` directory of the project. A sample file is included, but you should replace it with your actual resume text for the RAG tool to work correctly.
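For the RAG tool, the resume text is split into chunks, embedded with `nomic-embed-text`, and retrieved by similarity at query time. A simplified, hypothetical character-based chunker with overlap (the actual splitter used by `tools/resume_rag.py` may use a different size and strategy):

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping character chunks for embedding.

    Overlap keeps sentences that straddle a chunk boundary retrievable
    from either side. Sizes here are illustrative, not the project's settings.
    """
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

# Example: 1200 characters → chunks of lengths 500, 500, 300.
parts = chunk_text("a" * 1200)
```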
- Ensure Ollama is running.
- Activate the Conda environment:

  ```shell
  conda activate agent-env
  ```

- Run the agent script:

  ```shell
  python agent.py
  ```
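Before starting, you can optionally check that the Ollama server is reachable (it listens on port 11434 by default). A minimal sketch:

```python
import urllib.request
import urllib.error

def ollama_is_up(base_url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an Ollama server responds at base_url."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```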
Once the agent starts, it will prompt you for input. You can ask questions like:
- "What was my most recent job?" (Should trigger the
resume_rag_tool) - "Tell me about LifeMine Therapeutics." (Should trigger the
web_search_tool) - "What is LangGraph?" (Should trigger the
web_search_toolor be answered directly by the LLM)
Type quit or exit to end the conversation.
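The actual tool choice is made by the LLM inside the LangGraph loop, but a rough keyword heuristic illustrates the intent (hypothetical; this is not the project's routing logic):

```python
# Hypothetical heuristic for illustration only: the real agent lets the LLM
# decide which tool to call based on the full conversation.
RESUME_KEYWORDS = {"my", "i", "resume", "cv", "job", "experience", "education"}

def guess_tool(query: str) -> str:
    """Guess which tool a query would trigger (crude approximation)."""
    words = {w.strip("?.!,'\"") for w in query.lower().split()}
    return "resume_rag_tool" if words & RESUME_KEYWORDS else "web_search_tool"
```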
```
.
├── agent.py              # Main agent script with LangGraph logic
├── data/                 # Directory for data files
│   └── cv.txt            # Resume file (replace with your own)
├── docs/                 # Documentation assets
│   └── agent_graph.png   # Agent architecture diagram
├── requirements.txt      # Python dependencies
├── tools/                # Directory for tool implementations
│   ├── __init__.py
│   ├── resume_rag.py     # RAG tool implementation
│   └── web_search.py     # Web search tool implementation
└── README.md             # This file
```
