
Magellai

Magellai is a command-line tool for interacting with Large Language Models (LLMs), designed with Unix philosophy in mind. It aims to be a first-class Unix citizen, blending the deterministic nature of the command line with the probabilistic power of AI.

Features

  • Unix-style command interface - Follows established CLI patterns for intuitive use
  • Pipeline integration - Works seamlessly with Unix pipes and filters
  • Interactive REPL - Chat with models in a responsive interactive shell
  • Extensible plugin system - Extend functionality without modifying core code
  • Lua scripting support - Create custom workflows with embedded scripting
  • Configuration profiles - Manage different settings for different contexts
  • Agent capabilities - Run autonomous agents with specific capabilities
  • Workflow management - Define, run, and visualize multi-step processes

Installation

From Source

# Clone the repository
git clone https://github.com/lexlapax/magellai.git
cd magellai

# Build the binary
make build

# Optionally, install to $GOPATH/bin
make install

Usage

Basic Commands

# Simple prompt-response
magellai ask "What is the capital of France?"

# Interactive chat session
magellai chat

# Run with a specific model
magellai ask --model gpt-4 "Explain quantum computing"

# Stream the response
magellai ask --stream "Write a short story about space exploration"

# Process a file
cat document.txt | magellai transform summarize

# Session Management
magellai session list                               # List all saved sessions
magellai session export my-session ./session.md     # Export a session to markdown
magellai chat                                       # Start a chat, then use commands below
/session save important-chat                        # Save the current chat session
/session load important-chat                        # Restore a previous chat session
/session delete old-session                         # Delete a saved session

Configuration

# Set configuration values
magellai config set default.provider openai
magellai config set default.model gpt-4

# List configuration
magellai config list

# Use a specific profile
magellai --profile work ask "Draft an email to the team"
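Magellai manages configuration with koanf, so profile-scoped settings could live in a config file along the lines of the fragment below. The file path and the `profiles` key layout are assumptions for illustration only; the `default.provider` and `default.model` keys mirror the `config set` commands above.

```yaml
# Hypothetical config file (path and profile layout assumed for illustration)
default:
  provider: openai
  model: gpt-4

profiles:
  work:
    provider: openai
    model: gpt-4
```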

Architecture

Magellai follows a layered architecture:

  1. Presentation Layer - CLI commands, flags, REPL
  2. Application Core Layer - Orchestrates workflows, manages state
  3. Engine & Extensibility Layer - Manages plugins, provides scripting
  4. LLM Integration Layer - Adapts the underlying go-llms library
  5. Infrastructure Layer - Configuration, logging, I/O, file system

Development

Magellai is built using:

  • urfave/cli for command-line handling
  • koanf for configuration management
  • slog for structured logging
  • go-llms for LLM interactions

Building from Source

# Get dependencies
go mod tidy

# Build
make build

# Run tests
make test

# Lint code
make lint

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT License

About

Personal scriptable AI assistant with library - a play on mage, Magellan, Go, AI, and LLM.
