-
Which Ollama model are you using? Not all of them support function calls.
-
The error `LLM must be a FunctionCallingLLM` means the agent in this tutorial requires an LLM that supports function calling. For local models like Ollama, make sure you are using the latest versions of the relevant packages:

pip install -U llama-index llama-index-core llama-index-llms-ollama

If you are using a different local LLM, ensure it implements the required function-calling interface (i.e., it is a FunctionCallingLLM). If you are still seeing the error after updating, double-check that your LLM initialization in the tutorial script matches the expected pattern for function calling. For Ollama, it should look something like:

from llama_index.llms.ollama import Ollama

llm = Ollama(
    model="your-model-name",
    is_function_calling_model=True,
    # ...other params
)

If you are using a different agent or workflow, or a non-OpenAI LLM, you may need to use a different agent class (such as ReActAgent, which does not require function calling). Let me know if you need help with a specific model or code snippet!
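For reference, here is a minimal sketch of that ReActAgent fallback, assuming the newer llama_index.core.agent.workflow API that the starter tutorial uses. ReActAgent drives tools through prompting alone, so it works with models that are not FunctionCallingLLMs. The multiply tool and the model name are illustrative placeholders, not from the tutorial:

import asyncio

from llama_index.core.agent.workflow import ReActAgent
from llama_index.llms.ollama import Ollama


def multiply(a: float, b: float) -> float:
    """Multiply two numbers and return the product."""
    return a * b


# "your-model-name" is a placeholder; use whatever model you have pulled.
llm = Ollama(model="your-model-name", request_timeout=120.0)

# ReActAgent does not require is_function_calling_model=True on the LLM.
agent = ReActAgent(
    tools=[multiply],
    llm=llm,
    system_prompt="You are a helpful assistant that can multiply numbers.",
)


async def main() -> None:
    response = await agent.run("What is 1234 * 4567?")
    print(str(response))


asyncio.run(main())

The trade-off is that ReAct relies on the model following the reasoning/acting prompt format, so tool use tends to be less reliable than native function calling with a model that supports it.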
-
Hi,
When running the workflow in this tutorial:
https://docs.llamaindex.ai/en/stable/getting_started/starter_example_local/
it runs into the following error:
llama_index.core.workflow.errors.WorkflowRuntimeError: Error in step 'run_agent_step': LLM must be a FunctionCallingLLM