A Python package that helps agents efficiently retrieve and summarize information from the web without excessive token usage. The package takes a user's query as input and returns a structured response containing the most relevant and concise information fetched from the web. It ensures that the output is well-formatted and adheres to specific patterns, making it easy for agents to process and utilize the information.
## Installation

```bash
pip install websumm_agent
```

## Usage

```python
from websumm_agent import websumm_agent

user_input = "What are the latest developments in quantum computing?"
response = websumm_agent(user_input)
print(response)
```

### Using a custom LLM

You can pass your own LangChain-compatible LLM instance to use with different providers:

```python
from langchain_openai import ChatOpenAI
from websumm_agent import websumm_agent

llm = ChatOpenAI()
user_input = "What are the latest developments in quantum computing?"
response = websumm_agent(user_input, llm=llm)
```

```python
from langchain_anthropic import ChatAnthropic
from websumm_agent import websumm_agent

llm = ChatAnthropic(model="claude-3-5-sonnet-latest")  # a model name is required
user_input = "What are the latest developments in quantum computing?"
response = websumm_agent(user_input, llm=llm)
```

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from websumm_agent import websumm_agent

llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")  # a model name is required
user_input = "What are the latest developments in quantum computing?"
response = websumm_agent(user_input, llm=llm)
```

### Using an LLM7 API key

```python
from websumm_agent import websumm_agent

user_input = "What are the latest developments in quantum computing?"
response = websumm_agent(user_input, api_key="your_llm7_api_key_here")
```

## Parameters

- `user_input` (str) - The user input text to process.
- `llm` (Optional[BaseChatModel]) - The LangChain LLM instance to use. If not provided, the default ChatLLM7 is used.
- `api_key` (Optional[str]) - The API key for LLM7. If not provided, the package looks for the `LLM7_API_KEY` environment variable.
## Rate limits

By default, the package uses ChatLLM7 from langchain-llm7. The default rate limits of the LLM7 free tier are sufficient for most use cases. For higher rate limits, you can:

- Set the `LLM7_API_KEY` environment variable
- Pass your API key directly to the function
- Get a free API key by registering at https://token.llm7.io/
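If you prefer environment-based configuration, the key can be set before the first call; a minimal sketch (the key value below is a placeholder, not a real credential):

```python
import os

# Set the LLM7 API key via the environment variable the package reads.
# Placeholder value shown; substitute the key you obtain from https://token.llm7.io/
os.environ["LLM7_API_KEY"] = "your_llm7_api_key_here"

# Subsequent calls to websumm_agent(user_input) can then omit the
# api_key argument, since the package falls back to this variable.
```

Setting the variable in your shell profile or deployment environment (rather than in code) keeps the key out of source control.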
## Contributing

Issues and contributions are welcome! Please submit them through the GitHub repository.

## Author

Eugene Evstafev

- Email: hi@euegne.plus
- GitHub: chigwell