A simple Python script to talk to an Ollama install on your machine.
I gave it a more meaningful name.
I had an idea for a trivial task so I put copilot on it. Then I fiddled with it some more.
Perhaps I dream of making this a local instance of warp.dev? Do I dream?
This script lets you send a question to an Ollama model running locally. It will attempt to start Ollama if it is not already running.
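Roughly, that flow could be sketched like this (an illustration, not the script's actual code: `ensure_ollama` and `ask_ollama` are made-up names, and Ollama's default local API address `http://localhost:11434` is assumed):

```python
import shutil
import subprocess
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local API address


def ollama_is_running(url: str = OLLAMA_URL, timeout: float = 1.0) -> bool:
    """Return True if an Ollama server answers on its default port."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        return False


def ensure_ollama(url: str = OLLAMA_URL) -> bool:
    """Start `ollama serve` in the background if no server is up."""
    if ollama_is_running(url):
        return True
    if shutil.which("ollama") is None:
        return False  # Ollama CLI is not installed
    subprocess.Popen(
        ["ollama", "serve"],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return True


def ask_ollama(prompt: str, model: str) -> str:
    """Send one question through the Ollama CLI and return the reply."""
    result = subprocess.run(
        ["ollama", "run", model, prompt],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()
```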
```shell
python3 eyeball_jar.py "Your question here"
```

You can specify the model to use with the --model flag:
```shell
python3 eyeball_jar.py "Your question here" --model mistral:7b
```

The model setting will be saved in settings.cfg and used as the default for future runs. You can also edit settings.cfg directly:
```ini
[ollama]
model = mistral:7b
```

Add --interactive to enter a chat loop with the Ollama model. Each prompt is sent as a new question, and you can exit by typing exit or quit.
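That chat loop could look something like the sketch below (`ask` stands in for whatever function the script actually uses to query Ollama):

```python
def should_exit(line: str) -> bool:
    """True when the user typed exit or quit (case-insensitive)."""
    return line.strip().lower() in ("exit", "quit")


def chat_loop(ask, read=input, write=print) -> None:
    """Read prompts until exit/quit; each one is sent as a new question."""
    while True:
        try:
            line = read("> ")
        except EOFError:
            break  # Ctrl-D also ends the session
        if should_exit(line):
            break
        if line.strip():
            write(ask(line))
```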
```shell
python3 eyeball_jar.py --interactive
```

You can combine it with --model and --debug:
```shell
python3 eyeball_jar.py --interactive --model mistral:7b --debug
```

Add --web to use DuckDuckGo web search results as context for your question. The script will attempt to use ddgr to fetch search results and summarize them with your Ollama model. If ddgr is unavailable or rate-limited, it will pass the DuckDuckGo search results URL to Ollama for summarization.
Requirements:
- ddgr must be installed and available in your PATH for best results.
Usage:
```shell
python3 eyeball_jar.py "Your question here" --web
```

Example:
```shell
python3 eyeball_jar.py "what is Rasha recovery?" --web
```

Notes:
- If DuckDuckGo blocks automated queries, the script will fall back to summarizing the search results page URL.
- Use --debug or --verbose for troubleshooting and to see detailed output.
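The web step described above could be sketched like this (an illustration only, not the script's exact code; it uses ddgr's `--json`, `--np`, and `-n` flags):

```python
import json
import shutil
import subprocess
import urllib.parse


def results_page_url(query: str) -> str:
    """DuckDuckGo search results URL, used as the fallback context."""
    return "https://duckduckgo.com/?q=" + urllib.parse.quote(query)


def web_context(query: str, max_results: int = 5) -> str:
    """Fetch search snippets with ddgr; fall back to the results URL."""
    if shutil.which("ddgr") is None:
        return results_page_url(query)
    try:
        out = subprocess.run(
            ["ddgr", "--json", "--np", "-n", str(max_results), query],
            capture_output=True, text=True, timeout=15, check=True,
        ).stdout
        results = json.loads(out)
        return "\n".join(
            f"{r['title']}: {r.get('abstract', '')}" for r in results
        )
    except (subprocess.SubprocessError, json.JSONDecodeError):
        # ddgr failed or DuckDuckGo rate-limited the query
        return results_page_url(query)
```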
Add --debug to print extra information about the Ollama CLI call:
```shell
python3 eyeball_jar.py "Your question here" --debug
```

```shell
python3 eyeball_jar.py "name all the albums released by the band Genesis"
```

I will probably work on this more when I have free time. Or you can help?