A terminal user interface for managing Ollama models. Built with Python and Textual.
The Ollama CLI is powerful, but it lacks a built-in way to browse and discover models. This TUI fills that gap by scraping the Ollama registry and providing:
- Full model catalog - Browse 200+ models with descriptions and parameter sizes
- Smart search - Real-time filtering to find the model you need
- Version selection - See all available versions with download sizes before pulling
- One-key actions - Pull, delete, and stop models with a single keypress
No more guessing model names or checking the website - everything is accessible from your terminal.
Browse the complete Ollama registry with real-time filtering. See model parameters and descriptions at a glance.
Type to instantly filter the model list. Find what you need in seconds.
When pulling a model, choose the specific version you need. See download sizes before committing.
List, inspect, and delete your downloaded models.
See which models are loaded in memory with auto-refresh. Stop them when done.
- Browse 200+ models from the Ollama registry
- See parameter sizes (7b, 13b, 70b, etc.) and descriptions
- Real-time filtering - type to search instantly
- Version selection - choose specific tags with download sizes
- Smart caching - 24h cache reduces network requests
- List all downloaded models with size and modification date
- View detailed model information (architecture, parameters, license)
- Delete models with confirmation
- Monitor loaded models with GPU/CPU usage
- Auto-refresh every 5 seconds
- Stop running models to free memory
- Keyboard-first - Full navigation without mouse
- Visual progress - Progress bar during downloads
- Tab navigation - Switch views with 1/2/3 or arrow keys
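The running-model monitoring listed above reads Ollama's local HTTP API (`GET /api/ps` on port 11434). A minimal sketch of turning such a response into display rows — the field names mirror current Ollama responses but should be treated as assumptions, and the sample payload is illustrative:

```python
import json
# In the real tool, urllib.request (or httpx) would fetch
# http://localhost:11434/api/ps; here we parse a canned payload.

def summarize_ps(payload: str):
    """Turn an /api/ps-style JSON payload into (name, GB in VRAM) rows."""
    data = json.loads(payload)
    rows = []
    for m in data.get("models", []):
        vram_gb = m.get("size_vram", 0) / 1e9  # bytes -> gigabytes
        rows.append((m["name"], round(vram_gb, 1)))
    return rows

# Example payload shaped like an /api/ps response (values illustrative):
sample = '{"models": [{"name": "llama3:8b", "size": 5137025024, "size_vram": 5137025024}]}'
print(summarize_ps(sample))  # [('llama3:8b', 5.1)]
```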
- Python 3.10+
- Ollama installed and running
```sh
curl -fsSL https://raw.githubusercontent.com/elmisi/ollama-cli-tui/main/install.sh | sh
```

This clones the repo to `~/.local/share/ollama-tui/`, creates a virtual environment, and installs the `ollama-tui` command to `~/.local/bin/`. Running it again updates an existing installation.
```sh
git clone https://github.com/elmisi/ollama-cli-tui.git
cd ollama-cli-tui
./install.sh
```

To uninstall:

```sh
curl -fsSL https://raw.githubusercontent.com/elmisi/ollama-cli-tui/main/uninstall.sh | sh
```

Or, if installed from a local clone:

```sh
./uninstall.sh
```

Usage:

```sh
ollama-tui                # Run the TUI
ollama-tui --flush-cache  # Clear cached registry data
ollama-tui --version      # Show version
```

If running from source:

```sh
./run.py
# or
PYTHONPATH=src python -m ollama_tui
```

Registry data is cached in `~/.cache/ollama-tui/` for 24 hours to reduce network requests. Use `--flush-cache` to force a refresh.
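The freshness check behind that cache can be sketched like this — the cache directory matches the one above, but the file layout and function names are illustrative, not the tool's actual code:

```python
import json
import time
from pathlib import Path

CACHE_DIR = Path.home() / ".cache" / "ollama-tui"
CACHE_TTL = 24 * 60 * 60  # 24 hours, matching the TUI's cache window

def load_cached(name: str):
    """Return cached JSON data if it is fresh, else None."""
    path = CACHE_DIR / f"{name}.json"
    if not path.exists():
        return None
    if time.time() - path.stat().st_mtime > CACHE_TTL:
        return None  # stale: caller should re-fetch from the registry
    return json.loads(path.read_text())

def save_cache(name: str, data) -> None:
    """Write data as JSON; mtime of the file serves as the cache timestamp."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    (CACHE_DIR / f"{name}.json").write_text(json.dumps(data))
```

`--flush-cache` then amounts to deleting the files under `~/.cache/ollama-tui/` so the next run re-fetches everything.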
| Key | Action |
|---|---|
| `1` | Switch to Models tab |
| `2` | Switch to Running tab |
| `3` | Switch to Search tab |
| `←` `→` | Navigate between tabs |
| `q` | Quit |
| Key | Action |
|---|---|
| `↑` `↓` | Navigate list |
| `Enter` | Show model info |
| `d` | Delete model |
| `r` | Refresh list |
| Key | Action |
|---|---|
| `↑` `↓` | Navigate list |
| `s` | Stop model |
| `r` | Refresh list |
| Key | Action |
|---|---|
| `↑` `↓` | Navigate list |
| `/` | Focus search input |
| `p` | Pull selected model |
| `Enter` | Select version (in dialog) |
| `Escape` | Back to list / Cancel |
| `r` | Refresh from registry |
Since Ollama doesn't provide a public API for browsing models, this tool scrapes the Ollama Library page to fetch:
- Model names and descriptions
- Available parameter sizes
- Version tags with download sizes
Data is cached locally for 24 hours to be respectful of Ollama's servers.
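A rough sketch of that scraping step, using only the standard library. The sample markup below is a stand-in for the fetched page — the real structure of ollama.com's library page is an assumption here and may differ:

```python
from html.parser import HTMLParser

class ModelNameParser(HTMLParser):
    """Collects hrefs that look like /library/<model> links.

    The real tool would feed this the HTML fetched from
    https://ollama.com/library; the markup shape is assumed.
    """
    def __init__(self):
        super().__init__()
        self.models = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.startswith("/library/"):
                self.models.append(href.removeprefix("/library/"))

# A small sample stands in for the fetched registry page:
sample = '<a href="/library/llama3">llama3</a><a href="/library/mistral">mistral</a>'
parser = ModelNameParser()
parser.feed(sample)
print(parser.models)  # ['llama3', 'mistral']
```

Because the page layout can change at any time, the parsed results are exactly what gets cached for 24 hours, keeping repeat scrapes to a minimum.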
MIT License - see LICENSE for details.