websumm-agent


A Python package that helps agents efficiently retrieve and summarize information from the web without excessive token usage. Given a user's query, it returns a concise, structured response built from the most relevant information fetched from the web, formatted to a consistent pattern that downstream agents can easily parse and use.

Installation

pip install websumm_agent

Usage

Basic Usage

from websumm_agent import websumm_agent

user_input = "What are the latest developments in quantum computing?"
response = websumm_agent(user_input)
print(response)

Using a Custom LLM

You can pass your own LangChain-compatible LLM instance to use with different providers:

OpenAI

from langchain_openai import ChatOpenAI
from websumm_agent import websumm_agent

llm = ChatOpenAI()
user_input = "What are the latest developments in quantum computing?"
response = websumm_agent(user_input, llm=llm)

Anthropic

from langchain_anthropic import ChatAnthropic
from websumm_agent import websumm_agent

llm = ChatAnthropic()
user_input = "What are the latest developments in quantum computing?"
response = websumm_agent(user_input, llm=llm)

Google Generative AI

from langchain_google_genai import ChatGoogleGenerativeAI
from websumm_agent import websumm_agent

llm = ChatGoogleGenerativeAI()
user_input = "What are the latest developments in quantum computing?"
response = websumm_agent(user_input, llm=llm)

Using a Custom API Key

from websumm_agent import websumm_agent

user_input = "What are the latest developments in quantum computing?"
response = websumm_agent(user_input, api_key="your_llm7_api_key_here")

Parameters

  • user_input (str) - The user input text to process
  • llm (Optional[BaseChatModel]) - The LangChain LLM instance to use. If not provided, the default ChatLLM7 will be used.
  • api_key (Optional[str]) - The API key for LLM7. If not provided, the package will look for the LLM7_API_KEY environment variable.
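The documented fallback between the api_key argument and the LLM7_API_KEY environment variable can be sketched as follows. This is a minimal illustration of the behavior described above, not the package's actual code; resolve_api_key is a hypothetical helper name:

```python
import os

def resolve_api_key(api_key=None):
    # Hypothetical sketch: an explicit api_key argument takes precedence,
    # otherwise the LLM7_API_KEY environment variable is used.
    if api_key is not None:
        return api_key
    return os.environ.get("LLM7_API_KEY")

os.environ["LLM7_API_KEY"] = "key-from-environment"
print(resolve_api_key("key-from-argument"))  # the explicit argument wins
print(resolve_api_key())                     # falls back to the environment
```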

Default LLM Provider

By default, the package uses ChatLLM7 from langchain-llm7. The rate limits of the LLM7 free tier are sufficient for most use cases. For higher rate limits, you can:

  1. Get a free API key by registering at https://token.llm7.io/
  2. Set the LLM7_API_KEY environment variable, or pass the key directly to the function
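For example, after registering, the key can be made available via the environment before running your script (placeholder value shown):

```shell
# Export the key obtained from https://token.llm7.io/ (placeholder shown):
export LLM7_API_KEY="your_llm7_api_key_here"
```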

Contributing

Issues and contributions are welcome! Please submit them through the GitHub repository.

Author

Eugene Evstafev
Email: hi@eugene.plus
GitHub: chigwell