An AI-powered video editor that uses GitHub Copilot as its LLM provider, built as a monorepo.
Demo video: Recording.2026-01-26.215303.mp4
- AI Copilot: Chat with an AI assistant to edit videos, find assets, and analyze content.
- GitHub Copilot Integration: Use GitHub Copilot as your LLM provider for advanced coding and reasoning capabilities.
- Multimodal Understanding: The AI can "see" your video frames and understand context.
- Asset Integration: Search and download stock footage/images from Pexels and Unsplash.
- Local Processing: Uses FFmpeg and Whisper locally for privacy and performance.
- Web-based Editor: Built on Next.js and Remotion (forked from Clip-js).
- Frontend: Next.js (apps/web)
- Backend: Node.js + Express + WebSocket (apps/backend)
- MCP Servers: Modular tools for FFmpeg, Whisper, Vision, Assets, etc. (mcp-servers/*)
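The paths above imply a pnpm workspace layout along these lines; this is only an orientation sketch (directory roles are taken from the list, and anything beyond those three paths is an assumption):

```
apps/
  web/            # Next.js + Remotion editor UI
  backend/        # Node.js + Express + WebSocket server; manages the MCP servers
mcp-servers/      # modular tool servers: FFmpeg, Whisper, Vision, Assets, ...
```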
- Node.js 20+
- pnpm 9+
- FFmpeg installed on your system (for local backend operations, though some servers use static binaries)
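Before installing, you can confirm the prerequisites are in place (version thresholds are the ones listed above):

```bash
node --version    # expect v20 or newer
pnpm --version    # expect 9 or newer
ffmpeg -version   # needed for local backend operations
```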
- Install dependencies: run `pnpm install`.
- Environment Setup: copy `.env.example` to `apps/backend/.env` and `apps/web/.env.local` (run `cp .env.example apps/backend/.env` and `cp .env.example apps/web/.env.local`).

  Required Environment Variables:

  Backend (`apps/backend/.env`):
  - `LLM_PROVIDER`: `anthropic`, `openai`, `gemini`, or `copilot`
  - `ANTHROPIC_API_KEY` / `OPENAI_API_KEY` / `GEMINI_API_KEY`: API key for the chosen provider
  - `PEXELS_API_KEY`: for video/image search
  - `UNSPLASH_ACCESS_KEY`: for image search

  Frontend (`apps/web/.env.local`):
  - `NEXT_PUBLIC_BACKEND_URL`: URL of the backend (e.g., `http://localhost:3001`)

  An illustrative pair of env files is sketched after this list. For GitHub Copilot setup instructions, see docs/copilot.md.
- Start Development: run `pnpm dev`. This starts:
  - Frontend at http://localhost:3000
  - Backend at http://localhost:3001
  - All MCP servers (managed by backend)
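As referenced in the environment-setup step, here is an illustrative pair of env files. The variable names come from the list above; every value is a placeholder to replace with your own keys and URLs:

```bash
# apps/backend/.env (placeholder values)
LLM_PROVIDER=copilot           # or: anthropic | openai | gemini
ANTHROPIC_API_KEY=...          # only if LLM_PROVIDER=anthropic
OPENAI_API_KEY=...             # only if LLM_PROVIDER=openai
GEMINI_API_KEY=...             # only if LLM_PROVIDER=gemini
PEXELS_API_KEY=...             # video/image search
UNSPLASH_ACCESS_KEY=...        # image search

# apps/web/.env.local (placeholder value)
NEXT_PUBLIC_BACKEND_URL=http://localhost:3001
```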
Available scripts:

- `pnpm dev`: Start everything
- `pnpm dev:web`: Start only the frontend
- `pnpm dev:backend`: Start only the backend
- `pnpm build`: Build all packages
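If you want to target a single package directly, pnpm's directory filters offer a rough equivalent; this sketch assumes each package defines its own `dev` script, which is not confirmed by the scripts listed above:

```bash
# Approximate per-package equivalents using pnpm directory filters
pnpm --filter ./apps/web dev        # roughly pnpm dev:web
pnpm --filter ./apps/backend dev    # roughly pnpm dev:backend
```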
For detailed documentation, please visit the docs/ directory: