CloudInsight Extractor is a Python package designed to analyze and extract structured insights from user-submitted summaries or descriptions related to cloud infrastructure topics. Given an input text about a specific subject like cloud infrastructure developments, the package uses a language model to identify key themes, trends, and innovative points, presenting the results in a clear, organized format. This helps users quickly grasp essential information without manually sifting through lengthy descriptions, facilitating better understanding and decision-making in technical domains.
To install the CloudInsight Extractor package, run the following command:

```bash
pip install cloudinsight_extractor
```

Here is an example of how to use the `cloudinsight_extractor` function:
```python
from cloudinsight_extractor import cloudinsight_extractor

# Example usage with the default LLM7 model
response = cloudinsight_extractor(user_input="Your input text here")
print(response)
```

You can also use a custom LLM instance from LangChain. Here are examples for different LLM providers:
```python
from langchain_openai import ChatOpenAI
from cloudinsight_extractor import cloudinsight_extractor

llm = ChatOpenAI()
response = cloudinsight_extractor(user_input="Your input text here", llm=llm)
print(response)
```

```python
from langchain_anthropic import ChatAnthropic
from cloudinsight_extractor import cloudinsight_extractor

# ChatAnthropic requires a model name
llm = ChatAnthropic(model="claude-3-5-sonnet-20241022")
response = cloudinsight_extractor(user_input="Your input text here", llm=llm)
print(response)
```

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from cloudinsight_extractor import cloudinsight_extractor

# ChatGoogleGenerativeAI requires a model name
llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")
response = cloudinsight_extractor(user_input="Your input text here", llm=llm)
print(response)
```

Parameters:

- `user_input` (str): The user input text to process.
- `llm` (Optional[BaseChatModel]): The LangChain LLM instance to use. If not provided, the default `ChatLLM7` will be used.
- `api_key` (Optional[str]): The API key for LLM7. If not provided, the package uses the environment variable `LLM7_API_KEY` or a default value.
By default, the package uses ChatLLM7 from the langchain_llm7 package. You can find more information about ChatLLM7 here.
The default rate limits for LLM7 free tier are sufficient for most use cases of this package. If you want higher rate limits for LLM7, you can pass your own API key via the environment variable LLM7_API_KEY or directly via the api_key parameter. You can get a free API key by registering at LLM7.
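The key-resolution order described above can be sketched as follows. Note that `resolve_llm7_api_key` and the `default` placeholder are hypothetical names used purely for illustration, not part of the package's public API:

```python
import os

def resolve_llm7_api_key(api_key=None, default="default-key"):
    # Hypothetical helper illustrating the precedence described above:
    # an explicit api_key wins, then the LLM7_API_KEY environment
    # variable, and finally the package's built-in default value.
    if api_key is not None:
        return api_key
    return os.environ.get("LLM7_API_KEY", default)

# An explicitly passed key takes precedence over the environment.
print(resolve_llm7_api_key(api_key="my-key"))  # my-key
```

In practice this means you only need to export `LLM7_API_KEY` once in your shell profile; individual calls can still override it by passing `api_key` directly.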
If you encounter any issues or have suggestions, please open an issue on the GitHub repository.
- Eugene Evstafev
- Email: hi@eugene.plus
- GitHub: chigwell