
feat: ollama integration#76

Open
gamedevCloudy wants to merge 2 commits into SamuelSchmidgall:main from gamedevCloudy:main

Conversation

@gamedevCloudy

Integrated Ollama to run the repo locally.

Usage

Run ollama list in the terminal to get a list of models you have:

(screenshot: ollama list output showing the installed models)

Running the repo:

  • Add the prefix ollama: to the name of the model you want to run via Ollama, like this: ollama:model_name
  • Example:
python ai_lab_repo.py --llm-backend "ollama:deepseek-r1:1.5b" --research-topic "YOUR RESEARCH IDEA"
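The prefix handling can be sketched roughly as follows (parse_backend is a hypothetical name for illustration; the actual parsing in inference.py may differ):

```python
def parse_backend(llm_backend: str):
    """Split a --llm-backend string into (provider, model).

    'ollama:deepseek-r1:1.5b' -> ('ollama', 'deepseek-r1:1.5b')
    'gpt-4o-mini'             -> ('openai', 'gpt-4o-mini')
    """
    prefix = "ollama:"
    if llm_backend.startswith(prefix):
        # Everything after the first 'ollama:' is the model name,
        # which may itself contain colons (e.g. tag suffixes like ':1.5b').
        return "ollama", llm_backend[len(prefix):]
    return "openai", llm_backend
```

Note that Ollama model names can themselves contain colons (for tags), so only the leading prefix is stripped.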

Changes Made:

  • Ollama integration in inference.py
  • moved token and cost computation into a separate function; please let me know if this was not intended.
  • updated the README and requirements accordingly.
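As a rough sketch of what the Ollama integration involves, here is a minimal client against Ollama's local /api/chat endpoint (the function names and error handling are illustrative, not the actual inference.py code):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint

def build_payload(model: str, system: str, prompt: str) -> dict:
    """Build a chat request body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": prompt},
        ],
        "stream": False,  # one complete response instead of a token stream
    }

def query_ollama(model: str, system: str, prompt: str) -> str:
    """Send a chat request to a locally running Ollama server."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, system, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

Calling query_ollama("deepseek-r1:1.5b", "You are a helpful assistant.", "Hello") would return the model's reply, assuming the Ollama server is running locally.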

Notes:

  • I have skipped token computation for models run via Ollama. This is because users may have multiple models installed, each with a different encoding, which tiktoken may not support.
    Please let me know if I am mistaken here and I will make changes accordingly. Thanks.
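The skip logic could look roughly like this (count_tokens is a hypothetical helper; tiktoken is only attempted for hosted models, and unknown models fall back to None):

```python
def count_tokens(model: str, text: str):
    """Return a token count, or None when counting is skipped.

    Ollama models are skipped: locally installed models can use
    tokenizers that tiktoken does not know about.
    """
    if model.startswith("ollama:"):
        return None
    try:
        import tiktoken  # only needed for hosted (OpenAI) models
        enc = tiktoken.encoding_for_model(model)
    except (ImportError, KeyError):
        # tiktoken missing, or model name not in its registry
        return None
    return len(enc.encode(text))
```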

@gamedevCloudy gamedevCloudy changed the title ollama integration feat: ollama integration Feb 24, 2025
@LigetiCJ

LigetiCJ commented Mar 6, 2025

The only issue I've run into is that you still need to provide an API key. Please add a bypass for the API key and then you're golden!

@gamedevCloudy
Author

The only issue I've run into is that you still need to provide an API key. Please add a bypass for the API key and then you're golden!

I'll add a bypass.
I had been testing with OpenAI's key to make sure I did not break the existing implementation, which is why I didn't notice this.

Thanks for pointing it out - I'll make this change.

@gamedevCloudy
Author

patch: Bypassed API Key check when using Ollama.
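A minimal sketch of such a bypass (resolve_api_key is a hypothetical name, assuming the OPENAI_API_KEY environment variable for hosted backends; the actual patch may differ):

```python
import os

def resolve_api_key(llm_backend: str):
    """Return an API key, or None for fully local Ollama backends."""
    if llm_backend.startswith("ollama:"):
        return None  # local inference: no hosted API key required
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise ValueError("OPENAI_API_KEY is required for hosted backends")
    return key
```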
