Replies: 4 comments 3 replies
-
Thank you for your active support! Ollama does seem like a good option. Depending on the scope of the changes, I'm considering adding this behind an optional pip flag. Could you share more detail on the changes involved?
-
You can see the changes in my fork. The GPT.py changes were already applied in your repo, and the base_node change is just a personal preference. The Ollama changes are all in the llm_api folder: the __init__ and utils get updated, plus the new ollama.py. I added two new optional parameters to the get_query function, but it could also work fine with environment variables, following the pattern of the Azure and OpenAI details. Thank you!
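To make the description above concrete, here is a minimal sketch of what an llm_api/ollama.py following that pattern might look like. The function and parameter names (get_query, host, model) and the environment-variable names (OLLAMA_HOST, OLLAMA_MODEL) are assumptions based on this thread, not code taken from the fork; the actual HTTP call to the server is omitted.

```python
# Hypothetical sketch of llm_api/ollama.py as described in this thread.
# Names and env vars are illustrative assumptions, not the fork's actual code.
import os

DEFAULT_HOST = "http://localhost:11434"  # Ollama's default local endpoint


def resolve_ollama_config(host=None, model=None):
    """Resolve connection details from optional arguments, falling back to
    environment variables, mirroring the Azure/OpenAI credential pattern."""
    host = host or os.environ.get("OLLAMA_HOST", DEFAULT_HOST)
    model = model or os.environ.get("OLLAMA_MODEL", "llama3")
    return host, model


def get_query(prompt, host=None, model=None):
    """Build the URL and JSON payload for Ollama's /api/chat endpoint.

    The two optional parameters match the pattern described above; the
    actual HTTP POST (e.g. via requests) is left out of this sketch.
    """
    host, model = resolve_ollama_config(host, model)
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return host + "/api/chat", payload
```

Because both parameters fall back to environment variables, callers that never pass them keep working unchanged, which is what makes the flag-gated integration low-risk.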
-
Thanks for the clarifications.
-
I created an ollama.py script that wraps an Ollama client, and it works fine as an integration. It does require a few other file changes. Any thoughts on adding support for Ollama? I thought Ollama was a better target than individual local LLM integrations because many different models can all be supported with one class. Thank you!
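The "many models through one class" point above can be sketched as follows. The class and method names here are illustrative, not taken from the actual ollama.py; the request shape targets Ollama's /api/generate endpoint, and the network call itself is omitted.

```python
# Illustrative sketch of why one Ollama client class covers many local models.
# Class and method names are hypothetical, not from the fork's ollama.py.
class OllamaClient:
    """Thin client for a local Ollama server; any pulled model
    (llama3, mistral, phi3, ...) goes through the same interface."""

    def __init__(self, host="http://localhost:11434"):
        self.host = host

    def build_request(self, model, prompt):
        # The request shape is identical for every model; only the
        # "model" string changes, so no per-model integration code is needed.
        return {
            "url": f"{self.host}/api/generate",
            "json": {"model": model, "prompt": prompt, "stream": False},
        }


# Swapping models is just a string change, not a new integration:
client = OllamaClient()
req_a = client.build_request("llama3", "Hello")
req_b = client.build_request("mistral", "Hello")
```

This is the contrast with per-model local LLM support, where each backend would need its own loading and inference code.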