Launch your AI-powered product in minutes instead of weeks. This repo packages a modern FastAPI + Next.js stack with LangChain integration and a Frappe connector, so you can skip the plumbing and start building features. Skipping the setup and wiring work can easily save dozens of hours.
Local AI agents are gaining momentum: projects like LangGraph and OpenClaw suggest that self-hosted, autonomous tools are a big part of what comes next. But wiring together back-end services, front-end UIs, and LLM workflows is tedious. This boilerplate abstracts away the infrastructure so you can focus on building products that delight customers.
- FastAPI backend with LangChain example endpoints for text summarisation and a ready‑to‑use Frappe data connector.
- Next.js + Tailwind front‑end so your UI looks polished from day one.
- Dockerised services for one-command startup (`docker-compose up`).
- Authentication & billing stubs you can extend with your own Stripe keys.
- Local model support via Ollama, plus hooks for LangFlow or LangGraph agents.
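To give a feel for how the summarisation endpoint and the Ollama hook fit together, here is a minimal sketch using only the standard library. It assumes Ollama's default `/api/generate` endpoint on port 11434; the function names and the `llama3` model tag are illustrative placeholders, not the repo's actual code.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_summary_payload(text: str, max_sentences: int = 3) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {
        "model": "llama3",  # any locally pulled model tag works here
        "prompt": (
            f"Summarise the following in at most {max_sentences} sentences:\n\n{text}"
        ),
        "stream": False,  # return one complete response instead of a token stream
    }


def summarise(text: str) -> str:
    """Send the prompt to a local Ollama instance and return the summary text."""
    data = json.dumps(build_summary_payload(text)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

In the boilerplate itself the equivalent logic sits behind a FastAPI route and goes through LangChain, but the request/response shape against Ollama is the same.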
- Clone this repository and copy `.env.example` to `.env`.
- Run `docker-compose up -d` to spin up all containers.
- Visit `http://localhost:3000` to use the demo summarisation form and start building your own features.
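The steps above, as commands to run from the repository root (fill in your own keys before starting the stack):

```shell
# Create your local config from the template, then add Stripe / LLM keys
cp .env.example .env

# Build and start every service in the background
docker-compose up -d

# Confirm the containers are up, then open http://localhost:3000
docker-compose ps
```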
This project is part of a commercial template package. For licensing and support, see the release notes.