`LLM_API_FUNCTION` can be any LLM API function that takes `msg: list` and `shrink_idx: int`, and returns `llm_result: str` and `usage: dict`. Here `msg` is the prompt (a list of messages in the OpenAI chat format by default), and `shrink_idx` is the index at which the function should start shrinking the prompt if it overflows the model's context window.

You can write your custom LLM function by extending or following the format of https://github.com/Holmeswww/AgentKit/blob/main/src/agentkit/llm_api/base.py
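As a rough illustration of that interface, here is a minimal sketch of a custom LLM function. The signature (`msg`, `shrink_idx` in; `llm_result`, `usage` out) follows the description above; everything else — the `MAX_CHARS` limit, the shrink-by-dropping-a-message strategy, the stubbed-out model call, and the `usage` dict keys — is an assumption for illustration, not AgentKit's actual implementation.

```python
def my_llm_function(msg, shrink_idx):
    """Hypothetical LLM API function following the described interface.

    msg:        list of OpenAI-format messages, e.g. {"role": ..., "content": ...}
    shrink_idx: index at which to start dropping messages on overflow
    Returns (llm_result: str, usage: dict).
    """
    MAX_CHARS = 4096  # assumed context limit, for illustration only
    messages = list(msg)

    # On overflow, shrink the prompt starting at shrink_idx by dropping messages.
    while (sum(len(m["content"]) for m in messages) > MAX_CHARS
           and len(messages) > 1):
        messages.pop(min(shrink_idx, len(messages) - 1))

    prompt_text = " ".join(m["content"] for m in messages)

    # Stub "model call": a real implementation would call an LLM API here.
    llm_result = "Echo: " + prompt_text[:50]
    usage = {
        "prompt_tokens": len(prompt_text.split()),      # assumed key names
        "completion_tokens": len(llm_result.split()),
    }
    return llm_result, usage
```

A real implementation would replace the stub with an actual API call and extend the base class linked above instead of a bare function.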

Answer selected by Holmeswww
Category: Q&A
This discussion was converted from issue #20 on May 07, 2024 05:35.