Open-source LangGraph AI agent for building and operating n8n workflows with a skills-first architecture.
Quick Start • Architecture • Contributing • Security
Aria, powered by Aegra, helps automation builders and agent developers create, modify, and operate n8n workflows through a single assistant. It combines local skill packs, MCP tools, and n8n API access so workflow tasks are guided by domain best practices before tool execution.
Read the full story behind this project: Aria: LangGraph Agent Skills + MCP for n8n Workflows
Aria packages workflow design knowledge, node configuration details, and reliable execution tooling into one assistant so users can move faster with fewer errors.
- Skills-first workflow guidance before MCP tool calls
- Direct n8n integration through n8n-mcp
- Thin runtime overlay for Aegra config and service wiring
- Modular skills architecture under `backend/agents/`
- Optional UI for local manual end-to-end testing
- Docker-based local stack for backend + n8n + UI
Aria uses a skills-first architecture built on LangGraph: before the agent calls any tool, it loads the relevant skill pack to ground its reasoning in domain best practices. The skill packs and MCP integration are based on the work of @czlonkowski (n8n-skills, n8n-mcp). The design was inspired by Anthropic's Agent Skills and follows the Agent Skills open format.
| Skill | Purpose |
|---|---|
| `n8n-code-javascript` | JavaScript code node expertise — built-in functions, common patterns, error handling |
| `n8n-code-python` | Python code node expertise — data access, standard library, error patterns |
| `n8n-expression-syntax` | n8n expression language — syntax reference, common mistakes, examples |
| `n8n-mcp-tools-expert` | MCP tools integration — search, validation, and workflow operation guides |
| `n8n-node-configuration` | Node setup patterns — operation patterns, dependency management |
| `n8n-validation-expert` | Error troubleshooting — error catalog, false positive identification |
| `n8n-workflow-patterns` | Workflow architecture — webhooks, scheduled tasks, HTTP APIs, AI agents, DB operations |
Skills load in three levels to keep the context window lean:
- Catalog — skill names and one-line descriptions are always in the system prompt
- Instructions — full `SKILL.md`, loaded on demand via `load_skill()`
- Reference — supporting docs loaded individually via `read_skill_file()`
New skills are auto-discovered at startup — just drop a folder with a `SKILL.md` into `backend/agents/n8n_agent/skills/` and restart.
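In code, the three-level scheme can be sketched roughly as follows. The skills path matches this repo, but the function bodies below are illustrative only, not the project's actual implementation:

```python
from pathlib import Path

SKILLS_DIR = Path("backend/agents/n8n_agent/skills")  # path used in this repo

def discover_skills(skills_dir: Path) -> dict[str, str]:
    """Level 1 (Catalog): map each skill name to a one-line description.

    A skill is any subfolder containing a SKILL.md; here we treat the first
    non-empty line as the short description placed in the system prompt.
    """
    catalog = {}
    for skill_md in sorted(skills_dir.glob("*/SKILL.md")):
        first_line = next(
            (ln.strip() for ln in skill_md.read_text().splitlines() if ln.strip()),
            "",
        )
        catalog[skill_md.parent.name] = first_line
    return catalog

def load_skill(skills_dir: Path, name: str) -> str:
    """Level 2 (Instructions): return the full SKILL.md on demand."""
    return (skills_dir / name / "SKILL.md").read_text()

def read_skill_file(skills_dir: Path, name: str, filename: str) -> str:
    """Level 3 (Reference): load one supporting doc from a skill pack."""
    return (skills_dir / name / filename).read_text()
```

Because discovery is just a directory scan, restarting the backend after dropping in a new skill folder is all that is needed to register it.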
Run these commands from repository root:
```shell
make backend-cli-install
make n8n-up
cp backend/.env.example backend/.env
# edit backend/.env and set OPENAI_API_KEY + N8N_API_KEY
make backend-up
```

In a second terminal:

```shell
curl http://localhost:4242/health
cp ui/.env.example ui/.env
make ui-docker-up
```

Expected outcomes:

- n8n UI at http://localhost:4245
- backend health check returns `200 OK` at http://localhost:4242/health
- optional UI at http://localhost:4241
- Docker
- uv (Python package manager)
- A running n8n instance with API access enabled (see n8n Setup below)
Important: The agent connects to n8n via its REST API to read workflows, execute actions, and use n8n tools. Without a configured n8n instance, the agent starts but cannot do useful work, and the UI will show errors on every message.
```shell
make backend-cli-install
```

Or directly with uv:

```shell
uv tool install aegra-cli==0.7.2
```

Verify:

```shell
aegra --version
# aegra 0.7.2
```

```shell
make n8n-up
```

n8n is available at http://localhost:4245
First-time setup:
- Open http://localhost:4245 and complete the owner account setup
- Go to Settings → n8n API
- Click Create an API key and copy it
```shell
cp backend/.env.example backend/.env
```

Open `backend/.env` and fill in:
| Variable | Value |
|---|---|
| `OPENAI_API_KEY` | Your OpenAI key (or switch to Ollama — see LLM Configuration) |
| `N8N_API_KEY` | The API key you created in Step 2 |
`N8N_API_URL` is pre-filled to `http://localhost:4245/api/v1` and works with `make n8n-up` as-is.
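Before starting the backend, you can sanity-check the key against the n8n REST API yourself. Below is a minimal sketch using only the Python standard library; the `X-N8N-API-KEY` header is the one n8n's public API expects, while the helper name is illustrative and not part of this repo:

```python
import urllib.request

def n8n_request(path: str, api_key: str,
                base_url: str = "http://localhost:4245/api/v1") -> urllib.request.Request:
    """Build an authenticated request for the n8n public REST API."""
    req = urllib.request.Request(base_url + path)
    req.add_header("X-N8N-API-KEY", api_key)  # n8n's API-key header
    return req

# Live check (requires a running n8n instance and your key):
#   import os
#   with urllib.request.urlopen(n8n_request("/workflows", os.environ["N8N_API_KEY"])) as resp:
#       print(resp.status)  # 200 means the URL and key are valid
```

If the live check returns `401`, the key is wrong; a connection error usually means `N8N_API_URL` points at the wrong host or port.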
```shell
make backend-up
```

This starts n8n-mcp via Docker and then runs `aegra dev` with your agents loaded. Keep this terminal open; Aegra logs stream here.
```shell
curl http://localhost:4242/health
```

Expected: `200 OK`
```shell
cp ui/.env.example ui/.env
make ui-docker-up
```

The UI is available at http://localhost:4241
Typical prompts for Aria:
- "Create a workflow that receives webhook data and stores it in a database."
- "Review this workflow JSON and suggest reliability improvements."
- "Add retry, timeout, and error-handling patterns to my HTTP Request nodes."
- "Generate a scheduled reporting workflow and explain each node configuration."
- The UI is local development only and not production-safe.
- The UI stores conversations and app settings in browser IndexedDB (plaintext).
- Any JavaScript on the same origin can read this local data.
- See SECURITY.md for vulnerability reporting.
The `n8n/` folder contains a Docker Compose file for running n8n locally:

```shell
make n8n-up    # start
make n8n-down  # stop
```

Data is persisted in a named Docker volume (`aria_n8n_data`) so your workflows survive restarts.
- Open http://localhost:4245
- Complete the initial owner account setup if needed
- Navigate to Settings → n8n API
- Click Create an API key, give it a name, and copy the key
- Paste it into `backend/.env` as `N8N_API_KEY`
Point the agent at any reachable n8n instance by setting these in backend/.env:
```
N8N_API_URL=https://your-n8n.example.com/api/v1
N8N_API_KEY=your-api-key
N8N_WEB_URL=https://your-n8n.example.com
```

Edit `backend/.env` and activate one of the two options:
Option A — OpenAI (cloud):

```
LLM_PROVIDER=openai
OPENAI_API_KEY=sk-...
OPENAI_MODEL=gpt-5-mini
```

Option B — Ollama (local, free):

```
LLM_PROVIDER=ollama
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=qwen3
```

| Port | Service |
|---|---|
| 4241 | UI (Vite dev server) |
| 4242 | Aegra backend |
| 4244 | n8n-mcp sidecar |
| 4245 | n8n |
| Path | Purpose |
|---|---|
| `backend/agents/` | Agent graph, tools, prompts, and skill packs |
| `backend/` | Thin runtime overlay (Aegra config + service compose) |
| `n8n/` | Local n8n Docker Compose for development |
| `ui/` | Optional React harness for manual end-to-end checks (stores local conversations/settings in IndexedDB) |
| `scripts/` | CI and local guard/smoke scripts |
Where to start if you are:
- user/integrator: this `README.md`
- contributor: `CONTRIBUTING.md`
- maintainer: `ARCHITECTURE.md` and `backend/UPSTREAM.md`
See CONTRIBUTING.md for contribution guidelines and preferred workflows.
This repository does not vendor Aegra source code. See backend/UPSTREAM.md for runtime pinning and upgrade policy.
This project is licensed under the GNU General Public License v3.0 only (GPL-3.0-only). See LICENSE.
This project builds on the work of:
- Aegra — the agent runtime framework that powers the backend.
- n8n-mcp — the MCP server that exposes n8n workflows and tools to the agent.
- n8n-skills — skill pack patterns and reference implementations the agent skills are based on.
- Lovable — the UI was scaffolded and developed using Lovable.