Chat Gipitty (Chat Get Information, Print Information TTY) is a command line client primarily intended for the official OpenAI Chat Completions API. It lets you chat with language models from a terminal and pipe command output into them for analysis. While optimized for OpenAI's ChatGPT (with GPT-4 as the default model), it also works with any other provider that exposes an OpenAI-compatible endpoint.
Debug a Rust compilation error by piping the build output directly to ChatGPT:
```shell
cargo build 2>&1 | cgip "give me a short summary of the kind of error this is"
```

This results in something like:

```
❯ cargo build 2>&1 | cgip 'give me a short summary of the kind of error this is'
The error you're encountering is a **lifetime error** in Rust, specifically an issue with **borrowed values not living long enough**.
```

You can create useful command line utilities by combining cgip with shell aliases:
```shell
# Set up the alias
alias translate='cgip --system-prompt "You are a translator, you translate text to Spanish"'

# Use it
echo "Hello, world!" | translate
echo "Good morning" | translate
```

```shell
# Set up the alias
alias review='cgip --system-prompt "You are a senior developer" "review this code for bugs and improvements"'

# Use it
git diff | review
cat src/main.py | review
```

Key features:

- Universal AI Access: Works with OpenAI, local models via Ollama, Google Gemini, Mistral AI, Anthropic Claude, and any OpenAI-compatible provider
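An alias with a hard-coded prompt can be generalized into a small shell function that takes arguments. A minimal sketch for `~/.bashrc`, assuming only the `--system-prompt` flag shown above (the `to_lang` name is illustrative):

```shell
# Hypothetical helper: like the `translate` alias above, but the
# target language is an argument (defaulting to Spanish).
to_lang() {
  local lang="${1:-Spanish}"
  cgip --system-prompt "You are a translator, you translate text to ${lang}"
}

# Usage:
#   echo "Hello, world!" | to_lang French
```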
- Intelligent Piping: Pipe command output directly to AI models for instant analysis
- Multi-Modal Support: Text, image analysis, and text-to-speech capabilities
- Session Management: Maintain conversation context across terminal sessions
- Web Search: Get up-to-date information from the internet
- Agentic Workflows: Let the AI execute shell commands to accomplish tasks
- Flexible Configuration: Extensive customization options and provider support
Recommended: Install from crates.io with cargo:
```shell
cargo install cgip
```

Set up your OpenAI API key:

```shell
export OPENAI_API_KEY=your_key_here
```

For custom providers, set the base URL:

```shell
# For local Ollama
export OPENAI_BASE_URL=http://localhost:11434/v1

# For other providers
export OPENAI_BASE_URL=https://your-provider.com/v1
```
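Because the provider is selected purely through this environment variable, switching back and forth can be wrapped in small shell helpers. A sketch, assuming the local Ollama URL shown above (the function names are illustrative):

```shell
# Hypothetical helpers for switching providers within a session.
use_ollama() {
  export OPENAI_BASE_URL="http://localhost:11434/v1"
}

use_openai() {
  # Removing the override falls back to the default OpenAI endpoint.
  unset OPENAI_BASE_URL
}
```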
```shell
# Ask a question
cgip "What is the capital of France?"

# Pipe command output for analysis
ls -la | cgip "What can you tell me about these files?"

# Include file content in your query
cgip "explain this code" -f src/main.rs
```

Chat GipiTTY works with any service that implements the OpenAI Chat Completions API standard:
- OpenAI (ChatGPT, GPT-4, GPT-3.5, etc.)
- Local models via Ollama
- Google Gemini (via OpenAI-compatible endpoints)
- Mistral AI (via OpenAI-compatible endpoints)
- Anthropic Claude (via OpenAI-compatible endpoints)
- Any other provider implementing the OpenAI Chat Completions API standard
📖 Complete documentation is available here
The documentation includes:
- Installation Guide
- Getting Started
- Core Features
- All Subcommands (image, tts, embedding, agent, etc.)
- Configuration
- Advanced Usage
- Examples
Contributions are welcome! Please see the Contributing Guide and Development Workflow in the documentation.
This project is licensed under the terms specified in LICENSE.md.
