Cirser is a deployable, secure, rule-grounded Electrical Engineering reasoning system with interactive simulation-level visualization. Unlike standard chatbots, Cirser does not hallucinate answers; it constructs solutions by retrieving formal engineering rules, validating them against constraints, and delegating computation to symbolic solvers.
Live Demo: https://cirser.vercel.app
The system never "guesses". It enforces a strict pipeline:
- Retrieval: Fetches formal laws (KVL, Ohm's Law) from a vector database.
- Planning: The AI acts as an orchestrator, selecting the correct rule and variables.
- Symbolic Solving: Math is delegated to SymPy, which manipulates expressions exactly rather than through floating-point approximation.
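The three stages can be sketched end-to-end. Everything below (the rule table, function names, keyword matching) is illustrative stand-in code under assumed names, not Cirser's implementation:

```python
# Illustrative pipeline: retrieve a rule, plan variable bindings, solve.
# The rule "database" is a plain dict standing in for the vector store.

RULES = {
    "ohms_law": {"equation": "V = I * R", "variables": ("V", "I", "R")},
}

def retrieve(query: str) -> dict:
    """Stand-in for vector retrieval: match a rule by keyword."""
    if "ohm" in query.lower():
        return RULES["ohms_law"]
    raise LookupError("no rule found")

def solve_ohms_law(knowns: dict) -> float:
    """Stand-in for the symbolic solver: V = I * R rearranged as needed."""
    if "V" not in knowns:
        return knowns["I"] * knowns["R"]
    if "I" not in knowns:
        return knowns["V"] / knowns["R"]
    return knowns["V"] / knowns["I"]

rule = retrieve("Apply Ohm's law to this circuit")
current = solve_ohms_law({"V": 12.0, "R": 4.0})  # solve for I: 12 / 4 = 3.0 A
```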
A glassmorphic, "Electric Dark" interface built with React, Three.js, and Framer Motion:
- Simulation Stack: 3D Visualization of circuit nodes.
- Rule Stack: Context-aware cards that appear when rules are applied.
- Chat Stack: Logic-aware conversation interface.
- Control Stack: Real-time parameter tuning (Frequency, Voltage).
- Authentication: Full JWT-based Login/Signup system.
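The token flow behind Login/Signup can be illustrated with a stdlib-only sketch. This is a simplified stand-in: a real deployment would use a JWT library (e.g. PyJWT) with a proper header segment and expiry claims, and the secret here is a placeholder:

```python
import base64
import hashlib
import hmac
import json

SECRET = b"change-me"  # placeholder; the real key comes from the environment

def sign(payload: dict) -> str:
    """Produce a compact HMAC-signed token (illustrative, not a real JWT)."""
    body = base64.urlsafe_b64encode(json.dumps(payload).encode())
    sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return f"{body.decode()}.{sig}"

def verify(token: str) -> dict:
    """Reject any token whose signature does not match its body."""
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    return json.loads(base64.urlsafe_b64decode(body))

token = sign({"sub": "alice"})
```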
- Protection: All API endpoints are protected (guest access revoked).
- Rate Limiting:
  - Chat Endpoint: 20 req/min
  - AI Proxy: 10 req/min (Protects LLM Quota)
The system is deployed using a decoupled Microservices pattern:
- Tech: React, Vite, TailwindCSS, Framer Motion, Three.js.
- Role: Handles UI, Auth State (JWT), and Visualization.
- Security: Routes are protected; unauthenticated users are redirected to the Landing Page.
- Tech: FastAPI, Python 3.10, PostgreSQL (via Render), ChromaDB (Embedded).
- Role:
  - API Gateway: manages Auth and Rate Limiting.
  - RAG Engine: retrieves and ranks engineering rules.
  - Internal AI Proxy: communicates with the Hugging Face Inference API (Qwen-2.5-72B).
  - Symbolic Engine: solves the math via SymPy.
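For instance, the symbolic step for Ohm's law — rearranging V = I·R and substituting known values exactly — looks like this in SymPy (a sketch of the technique, not Cirser's internal code):

```python
from sympy import Eq, solve, symbols

# Declare circuit quantities as positive symbols.
V, I, R = symbols("V I R", positive=True)
ohms_law = Eq(V, I * R)

# Rearrange symbolically for I, then substitute known values.
i_expr = solve(ohms_law, I)[0]        # V/R
current = i_expr.subs({V: 12, R: 4})  # exact rational arithmetic: 3
```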
- Python 3.10+
- Node.js 18+
- Docker (Optional)
Backend:

```bash
cd backend
pip install -r requirements.txt
# Set Environment Variables (see .env.example)
# DATABASE_URL=sqlite:///./sql_app.db
# HF_API_KEY=your_huggingface_key
uvicorn app.main:app --reload
```

Frontend:

```bash
cd frontend
npm install
npm run dev
```

Docker (optional):

```bash
docker-compose up --build
```

Backend deployment (Render):
- Connect repository to Render.
- Select Blueprint or Web Service (Docker).
- Set Environment Variables:
  - HF_API_KEY: Your Hugging Face Token.
  - PORT: 8000.
Frontend deployment (Vercel):
- Connect repository to Vercel.
- Set Environment Variable:
  - VITE_API_URL: The URL of your Render backend (e.g., https://cirser.onrender.com/api/v1).
- Deploy.
- No Chat History Memory: Every request is reasoned about independently.
- Rule-Grounding: Every output cites a vetted source.
- Input Sanitization: All inputs are validated via Pydantic schemas.
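A representative Pydantic schema for that validation step (field names and limits are illustrative, not Cirser's actual schema):

```python
from pydantic import BaseModel, Field, ValidationError

class ChatRequest(BaseModel):
    """Schema-level sanitization: reject oversized or malformed input
    before it reaches the RAG engine."""
    prompt: str = Field(..., min_length=1, max_length=2000)
    frequency_hz: float = Field(50.0, gt=0)

# An empty prompt and a negative frequency both fail validation.
try:
    ChatRequest(prompt="", frequency_hz=-60)
except ValidationError as exc:
    errors = exc.errors()
```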