Chatbot Basic

A Streamlit-powered chatbot application that supports both OpenAI GPT and Ollama models, with integrated Arxiv paper retrieval as a tool.

Features

  • Conversational AI: Chat with an assistant powered by either OpenAI GPT-4o or a local Ollama model (llama3.2).
  • Arxiv Paper Retrieval: The retrieve tool fetches and summarizes academic papers from Arxiv.
  • Streaming Responses: AI responses are streamed token by token for a smooth chat experience.
  • Model Switching: The app's subtitle shows which model is currently active.
  • LangSmith Tracing: Optional tracing for debugging and analytics.
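The paper lookup behind the retrieve tool can be served by Arxiv's public export API. A minimal query-URL builder is sketched below; the helper name and parameters are hypothetical, and the app's actual retrieval code may differ:

```python
from urllib.parse import urlencode

ARXIV_API = "http://export.arxiv.org/api/query"

def build_arxiv_query(topic: str, max_results: int = 5) -> str:
    # Build a query URL for Arxiv's public Atom export API.
    params = {
        "search_query": f"all:{topic}",
        "start": 0,
        "max_results": max_results,
    }
    return f"{ARXIV_API}?{urlencode(params)}"
```

Fetching the returned URL (e.g., with urllib) yields an Atom XML feed of matching papers that can be parsed for titles and abstracts.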

(Screenshot: the Streamlit UI chat window)

Setup

1. Clone the repository:

   git clone https://github.com/wangychn/streamlit-chatbot.git
   cd streamlit-chatbot

2. Install dependencies:

   pip install -r requirements.txt

3. Set environment variables:

  • Create a .env file in the root directory:
   OPENAI_API_KEY=your-openai-key
   LANGCHAIN_API_KEY=your-langsmith-key
  • (Optional) Set additional LangSmith tracing variables if needed.
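The app presumably reads the .env file with python-dotenv's load_dotenv(); for illustration, here is a stdlib-only sketch of what that loading step amounts to (the helper name is hypothetical):

```python
import os

def load_env_file(path: str = ".env") -> None:
    # Minimal .env parser: copy KEY=value lines into os.environ.
    # In practice python-dotenv's load_dotenv() handles this.
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault keeps any value already set in the real environment
            os.environ.setdefault(key.strip(), value.strip())
```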

4. Start Ollama (if using Ollama):

  • Make sure Ollama is running locally and the desired model (e.g., llama3.2) is available.
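With the Ollama CLI installed, getting the model ready typically looks like this (skip `ollama serve` if the Ollama desktop app is already running):

```shell
# Download the model once, then make sure the local server is up.
ollama pull llama3.2
ollama serve
```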

Usage

Run the Streamlit app:

streamlit run src/streamlit_app.py
  • The app will display a chat interface.
  • The subtitle will show which model is active (GPT-4o or Ollama).
  • Type your questions in the chat input.
  • To retrieve papers, mention in your message that you want to look into academic papers; the assistant will invoke the retrieve tool.
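A hypothetical sketch of what the chat loop in streamlit_app.py likely looks like, assuming Streamlit's chat primitives (st.chat_input, st.chat_message, st.write_stream); the actual function and variable names in the repository may differ:

```python
def render_chat(st, stream_reply):
    # st: the imported streamlit module.
    # stream_reply: a function taking the prompt and yielding response
    # tokens (hypothetical name for the chatbot's streaming call).
    if prompt := st.chat_input("Ask a question"):
        with st.chat_message("user"):
            st.write(prompt)
        with st.chat_message("assistant"):
            # write_stream renders tokens incrementally as they arrive.
            st.write_stream(stream_reply(prompt))
```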

File Structure

src/
  chatbot.py         # Chatbot logic and graph construction
  streamlit_app.py   # Streamlit UI and app entry point

Customization

  • Switch models: Change the llm initialization in streamlit_app.py to use either ChatOpenAI or ChatOllama.
  • Add tools: Extend chatbot.py to add more tools or retrieval functions.
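The model switch described above might be factored like this sketch (imports are deferred so only the selected backend's package needs to be installed; the actual constructor arguments in streamlit_app.py may differ):

```python
def make_llm(use_openai: bool = True):
    # Lazy imports: only the chosen backend's package must be present.
    if use_openai:
        from langchain_openai import ChatOpenAI
        return ChatOpenAI(model="gpt-4o")   # reads OPENAI_API_KEY from env
    from langchain_ollama import ChatOllama
    return ChatOllama(model="llama3.2")     # talks to the local Ollama server
```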

Troubleshooting

  • Session state errors: Always run the app with streamlit run, not plain python.
  • Model errors: Ensure your API keys and Ollama server are set up correctly.
  • Streaming issues: Input is disabled while the AI is generating a response to prevent interruptions.
