Magellai is a command-line tool for interacting with Large Language Models (LLMs), designed with Unix philosophy in mind. It aims to be a first-class Unix citizen, blending the deterministic nature of the command line with the probabilistic power of AI.

## Features
- **Unix-style command interface** - Follows established CLI patterns for intuitive use
- **Pipeline integration** - Works seamlessly with Unix pipes and filters
- **Interactive REPL** - Chat with models in a responsive interactive shell
- **Extensible plugin system** - Extend functionality without modifying core code
- **Lua scripting support** - Create custom workflows with embedded scripting
- **Configuration profiles** - Manage different settings for different contexts
- **Agent capabilities** - Run autonomous agents scoped to specific capabilities
- **Workflow management** - Define, run, and visualize multi-step processes
## Installation

```shell
# Clone the repository
git clone https://github.com/lexlapax/magellai.git
cd magellai

# Build the binary
make build

# Optionally, install to $GOPATH/bin
make install
```

## Usage

```shell
# Simple prompt-response
magellai ask "What is the capital of France?"

# Interactive chat session
magellai chat

# Run with a specific model
magellai ask --model gpt-4 "Explain quantum computing"

# Stream the response
magellai ask --stream "Write a short story about space exploration"

# Process a file
cat document.txt | magellai transform summarize
```
## Session Management

```shell
magellai session list                           # List all saved sessions
magellai session export my-session ./session.md # Export a session to Markdown

magellai chat                # Start a chat, then use the commands below
/session save important-chat # Save the current chat session
/session load important-chat # Restore a previous chat session
/session delete old-session  # Delete a saved session
```

## Configuration

```shell
# Set configuration values
magellai config set default.provider openai
magellai config set default.model gpt-4

# List configuration
magellai config list

# Use a specific profile
magellai --profile work ask "Draft an email to the team"
```

## Architecture

Magellai follows a layered architecture:
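Since configuration is handled by koanf (see below), which can load settings from a file as well as from `config set`, profiles can also be kept in a config file. The sketch below assumes a YAML file and a hypothetical schema that mirrors the keys used above (`default.provider`, `default.model`); the actual file location and layout may differ.

```yaml
# Hypothetical config file sketch -- the real schema may differ.
# Top-level defaults mirror the `magellai config set` keys above.
default:
  provider: openai
  model: gpt-4

# A hypothetical "work" profile, selected with `--profile work`.
profiles:
  work:
    provider: openai
    model: gpt-4
```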
- **Presentation Layer** - CLI commands, flags, REPL
- **Application Core Layer** - Orchestrates workflows, manages state
- **Engine & Extensibility Layer** - Manages plugins, provides scripting
- **LLM Integration Layer** - Adapts the underlying `go-llms` library
- **Infrastructure Layer** - Configuration, logging, I/O, file system
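The value of the LLM Integration Layer is that the core never depends on a concrete backend. A minimal sketch of that adapter idea is below; all names (`Provider`, `Ask`, `mockProvider`) are invented for illustration and are not Magellai's actual API.

```go
package main

import (
	"context"
	"fmt"
)

// Provider is the narrow interface the application core talks to.
// A real adapter would wrap the go-llms library behind it.
type Provider interface {
	Complete(ctx context.Context, prompt string) (string, error)
}

// mockProvider is a stand-in backend, handy for tests and offline runs.
type mockProvider struct{ reply string }

func (m mockProvider) Complete(_ context.Context, _ string) (string, error) {
	return m.reply, nil
}

// Ask belongs to the core layer: it only sees the Provider interface,
// so swapping backends never touches core code.
func Ask(p Provider, prompt string) (string, error) {
	return p.Complete(context.Background(), prompt)
}

func main() {
	out, _ := Ask(mockProvider{reply: "Paris"}, "What is the capital of France?")
	fmt.Println(out)
}
```

Because only the Infrastructure and LLM Integration layers know about concrete dependencies, plugins and scripts can target the same small interfaces.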
Magellai is built using:
- `urfave/cli` for command-line handling
- `koanf` for configuration management
- `slog` for structured logging
- `go-llms` for LLM interactions
## Development

```shell
# Get dependencies
go mod tidy

# Build
make build

# Run tests
make test

# Lint code
make lint
```

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.