A lightweight CLI client for the Linux MCP server. This tool runs on headless servers without requiring X11 or other display libraries, enabling interactive Linux system diagnostics directly from the terminal.
- 🤖 Interactive chat interface with AI-powered troubleshooting
- 🔍 Read-only system diagnostics via Linux MCP server
- 🎨 Rich console UI with color-coded output
- ⚙️ Environment-based configuration
- 🐳 Dev container support for easy setup
- Python 3.10+
- Ollama running locally (default: http://localhost:11434)
- linux-mcp-server installed
- Open project in VS Code
- Reopen in container when prompted
- Wait for automatic setup to complete
```bash
# Create virtual environment
python -m venv .venv
source .venv/bin/activate

# Install dependencies
pip install -r requirements.txt
pip install linux-mcp-server
```

Configure via environment variables:
| Variable | Default | Description |
|---|---|---|
| `LINUX_MCP_SERVER` | `.venv/bin/linux-mcp-server` | Path to MCP server executable |
| `LLM_PROVIDER` | `ollama` | LLM backend (`ollama` or `googlegenai`) |
| `LLM_MODEL` | `llama3.2:latest` | Model name for the selected backend |
| `LLM_BASE_URL` | `http://localhost:11434` | Ollama server URL (used when `LLM_PROVIDER=ollama`) |
| `REQUEST_TIMEOUT` | `90` | Request timeout in seconds |
Use one of the following configurations before starting the agent.
```bash
export LLM_PROVIDER=ollama
export LLM_MODEL=llama3.2:latest
export LLM_BASE_URL=http://localhost:11434
```

```bash
export LLM_PROVIDER=googlegenai
export LLM_MODEL=gemini-2.0-flash
export GOOGLE_API_KEY=your_api_key_here
```

```bash
# As a module
python -m linux_mcp_agent

# Or directly
python linux_mcp_agent
```

```text
You: What is the free disk space?
Agent: [Analyzes system and provides disk usage information]

You: Check if nginx is running
Agent: [Queries service status and reports findings]

You: exit
Goodbye!
```
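The session above follows a simple read-eval-print loop. A minimal sketch of such a loop (the real `agent.py` wires the `ask` callback to LlamaIndex and the MCP tools; the function and parameter names here are hypothetical):

```python
def chat_loop(ask, console_print=print, console_input=input):
    """Minimal REPL sketch: `ask` maps a user question to an agent reply.

    `console_print` / `console_input` are injectable so the loop can be
    driven by Rich console helpers (or by tests) instead of bare stdio.
    """
    while True:
        try:
            line = console_input("You: ").strip()
        except EOFError:
            break  # treat end-of-input like an explicit exit
        if line.lower() in {"exit", "quit"}:
            break
        if line:
            console_print(f"Agent: {ask(line)}")
    console_print("Goodbye!")
```

Accepting the I/O functions as parameters keeps the loop itself trivial to unit-test without a live LLM.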
```bash
make install   # Install dependencies
make format    # Format code with black
make lint      # Run flake8 and mypy
make test      # Run pytest
make all       # Run format, lint, and test
make clean     # Remove cache files
```

```text
linux_mcp_agent/
├── __init__.py      # Package initialization
├── __main__.py      # Entry point
├── agent.py         # Main agent logic
└── config.py        # Configuration management
tests/
├── __init__.py
└── test_config.py   # Configuration tests
```
The agent uses:
- LlamaIndex: Framework for LLM-powered applications
- MCP (Model Context Protocol): Standard protocol for tool access
- Ollama: Local LLM inference
- Rich: Terminal UI formatting
The agent operates in read-only mode: it is limited to diagnostic tools exposed by the Linux MCP server, so it cannot modify the system.
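One way to picture the read-only guarantee is a filter on the client side that only admits tools following a diagnostic naming convention. The prefixes and tool names below are purely hypothetical — in practice the guarantee rests on the server exposing only read-only tools in the first place:

```python
# Hypothetical naming convention for non-mutating diagnostic tools.
READ_ONLY_PREFIXES = ("get_", "list_", "read_", "check_")


def filter_read_only(tool_names):
    """Keep only tool names that look read-only under the convention above.

    This is a defense-in-depth sketch; the actual enforcement belongs to
    linux-mcp-server, which simply does not register mutating tools.
    """
    return [name for name in tool_names if name.startswith(READ_ONLY_PREFIXES)]
```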
See LICENSE file for details.