A 24/7 local AI work assistant built with Go. Gateway, sessions, memory, and tool execution stay on your machine.
Language: 中文 | English
maxclaw is a local AI agent for developers and operators. Core value proposition: low memory footprint, fully local workflow, visual desktop/web UI, and fast onboarding.
- Go backend, resource-efficient runtime: single binary gateway + tool orchestration.
- Fully local workflow: sessions, memory, logs, and tool runs are stored locally.
- Desktop UI + Web UI: visual settings, streaming chat, file preview, and terminal integration.
- Out-of-the-box setup: one-command install and default workspace templates.
SEO keywords: Go AI Agent, local AI assistant, self-hosted AI agent, private AI workflow, desktop AI app, low-memory AI.
- Go-native agent loop and tool system
- Fully local execution path with auditable artifacts
- Desktop UI + Web UI + API on the same port
- `executionMode=auto` for unattended long-running tasks
- `spawn` sub-sessions with independent context/model/source and status callbacks
- Automatic task titles that summarize sessions without overwriting message content
- Monorepo-aware recursive context discovery (`AGENTS.md`/`CLAUDE.md`)
- Multi-channel integrations: Telegram, WhatsApp (Bridge), Discord, WebSocket
- Cron/Once/Every scheduler + daily memory digest
If you are familiar with OpenClaw, maxclaw follows similar local-first principles with a Go-first engineering focus:
- Local-first agent execution and private data boundaries
- Heartbeat context (`memory/heartbeat.md`)
- Memory layering (`MEMORY.md` + `HISTORY.md`)
- Autonomous mode (`executionMode=auto`)
- Sub-agent task split via `spawn`
- Monorepo context discovery for multi-module repositories
- Install Go 1.24+ and Node.js 18+
- Build: `make build`
- Initialize workspace: `./build/maxclaw onboard`
- Configure: edit `~/.maxclaw/config.json`
- Run gateway: `./build/maxclaw-gateway -p 18890`
Built binaries:
- `./build/maxclaw`: full CLI (`onboard`, `skills`, `telegram bind`, `gateway`, ...)
- `./build/maxclaw-gateway`: standalone backend for desktop packaging or headless use
All-in-one local dev startup:
```bash
make build && make restart-daemon && make electron-start
```

Common dev restart commands:

```bash
make dev-gateway
make backend-restart
make dev-electron
make electron-restart
```

One-line install:

```bash
curl -fsSL https://raw.githubusercontent.com/Lichas/maxclaw/main/install.sh | bash
```

Path: `~/.maxclaw/config.json`

```json
{
  "providers": {
    "anthropic": { "apiKey": "your-anthropic-key" }
  },
  "agents": {
    "defaults": {
      "model": "anthropic/claude-opus-4-5",
      "workspace": "/absolute/path/to/your/workspace",
      "executionMode": "auto"
    }
  }
}
```

OpenAI native models use the official openai-go SDK with default API base `https://api.openai.com/v1`.
Anthropic native models use the official anthropic-sdk-go SDK with default API base `https://api.anthropic.com`.
Set `agents.defaults.executionMode`:

- `safe`: conservative exploration mode
- `ask`: default mode
- `auto`: autonomous continuation (no manual "continue" approval for paused plans)
- Build: `make webui-install && make webui-build`
- Start: `./build/maxclaw-gateway -p 18890`
- Open: `http://localhost:18890`
- Architecture: `ARCHITECTURE.md`
- Operations: `MAINTENANCE.md`
- Browser runbook: `BROWSER_OPS.md`
- Full Chinese docs and all channel/config examples: `README.zh.md`
