An AI-powered tutoring assistant that analyzes your whiteboard or notebook in real time, answers questions, and helps you study through voice, chat, practice tests, summaries, and flashcards.
Built by Manraaj Singh & Sukhraj Sandhar
No setup needed — the app is fully deployed.
https://mentora-tutor.netlify.app/
Open in Chrome, allow camera and microphone access, and start a session.
- Live whiteboard analysis — point your camera at notes or equations and get instant explanations
- AI chat — subject-aware tutoring across Math, Physics, Chemistry, Biology, CS, History, Literature, and Economics
- Voice mode — real-time voice conversation with Gemini Live API
- Practice tests — generate tests from your session, typed notes, or uploaded files
- Summaries — get a structured summary of any tutoring session
- Flashcards — auto-generate flashcards from your session
- File attachment — attach PDFs or images directly in chat
- Session history — all chats saved to Firestore, accessible across sessions
- Subject-aware personas — each subject has its own tutor personality
- Adaptive complexity — automatically adjusts explanation depth based on your level
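The last two features compose at the prompt level: a subject persona plus a detected complexity level shape each request before it reaches the model. The helper below is a minimal sketch of that idea; the function name, persona strings, and level labels are illustrative, not the actual contents of `personas.js` or `complexity.js`.

```javascript
// Hypothetical sketch: combine a subject persona with a complexity level
// into one system prompt. Strings are illustrative only; the real logic
// lives in backend/js/personas.js and backend/js/complexity.js.
const PERSONAS = {
  math: 'You are a patient math tutor who works through problems step by step.',
  physics: 'You are a physics tutor who grounds every concept in a real example.',
};

const COMPLEXITY_HINTS = {
  beginner: 'Avoid jargon and explain every symbol you introduce.',
  advanced: 'Be concise and assume familiarity with standard notation.',
};

function buildSystemPrompt(subject, level) {
  const persona = PERSONAS[subject] ?? 'You are a helpful tutor.';
  const hint = COMPLEXITY_HINTS[level] ?? COMPLEXITY_HINTS.beginner;
  return `${persona}\n${hint}`;
}

console.log(buildSystemPrompt('math', 'beginner'));
```

Keeping persona and complexity as separate lookups means either axis can change mid-session without touching the other.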
Frontend
- Vanilla JS (ES modules)
- Firebase Firestore (client SDK) — session storage
- Hosted on Firebase Hosting
Backend
- Node.js + Express
- Google Vertex AI — all AI endpoints (chat, analyze, practice test, summarize, flashcards)
- Gemini Live API — real-time voice WebSocket proxy
- Google Secret Manager — stores the Live API key securely
- Deployed on Google Cloud Run
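A typical request from the frontend to one of these endpoints bundles the message with session context. The shape below is a guess for illustration only; the endpoint path and field names are defined by `server.js`, not by this README.

```javascript
// Illustrative only: the endpoint path and field names here are
// assumptions, not the actual API exposed by server.js.
function buildChatRequest(apiBase, { subject, message, history = [] }) {
  return {
    url: `${apiBase}/chat`,
    options: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ subject, message, history }),
    },
  };
}

// Usage in the browser would be: fetch(url, options)
const { url, options } = buildChatRequest('http://localhost:3001', {
  subject: 'physics',
  message: 'Why is the sky blue?',
});
console.log(url); // http://localhost:3001/chat
```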
```
Mentora/
├── backend/
│   ├── js/
│   │   ├── server.js        # Main Express server + WebSocket proxy
│   │   ├── prompts.js       # All AI prompts
│   │   ├── personas.js      # Subject tutor personas
│   │   └── complexity.js    # Complexity detection logic
│   ├── Dockerfile
│   └── package.json
├── frontend/
│   ├── index.html
│   ├── js/
│   │   ├── app.js           # App entry point
│   │   ├── config.js        # Backend URL — change this for self-hosted
│   │   ├── firebase.js      # Firestore config — change this for self-hosted
│   │   ├── messages.js      # Chat message handling
│   │   ├── history.js       # Session history sidebar
│   │   ├── voice.js         # Voice/Live API
│   │   ├── camera.js        # Webcam capture
│   │   ├── practice-test.js # Practice test UI
│   │   ├── summary.js       # Session summary
│   │   ├── flashcards.js    # Flashcard UI
│   │   ├── fileAttach.js    # File attachment
│   │   ├── export.js        # Export session
│   │   └── state.js         # Global state
│   └── css/
├── firebase.json
├── firestore.rules
└── .firebaserc
```
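`complexity.js` is the piece behind the adaptive-complexity feature. The heuristic below is a hypothetical sketch of how such detection could work, not the project's actual logic: it scores a message by vocabulary and length, then maps the score to an explanation depth.

```javascript
// Hypothetical heuristic, not the actual contents of complexity.js:
// score a student's message and map it to an explanation depth.
const ADVANCED_TERMS = ['derivative', 'integral', 'entropy', 'eigenvalue', 'big-o'];

function detectComplexity(message) {
  const text = message.toLowerCase();
  let score = 0;
  for (const term of ADVANCED_TERMS) {
    if (text.includes(term)) score += 2;
  }
  if (text.split(/\s+/).length > 30) score += 1; // long, detailed questions
  if (score >= 3) return 'advanced';
  if (score >= 1) return 'intermediate';
  return 'beginner';
}

console.log(detectComplexity('What is a derivative?'));               // intermediate
console.log(detectComplexity('Explain the eigenvalue entropy bound')); // advanced
```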
If you want to run your own instance against your own GCP project, follow these steps.
- Node.js 20+
- Google Cloud SDK
- Firebase CLI — `npm install -g firebase-tools`
- A Google Cloud project with these APIs enabled:
  - Vertex AI
  - Cloud Run
  - Secret Manager
  - Firestore
```bash
gcloud config set project YOUR_PROJECT_ID

gcloud services enable \
  aiplatform.googleapis.com \
  run.googleapis.com \
  secretmanager.googleapis.com \
  firestore.googleapis.com
```

- Go to Firebase Console and create a project with the same project ID
- Go to Firestore Database → Create database → Start in production mode
- Go to Project Settings → Your apps → Add a web app → copy the config object
Get a key from aistudio.google.com, then:
```bash
echo -n "YOUR_GEMINI_API_KEY" | gcloud secrets create GEMINI_API_KEY \
  --data-file=- \
  --project=YOUR_PROJECT_ID
```

```bash
cd backend
gcloud run deploy mentora-backend \
  --source . \
  --region us-central1 \
  --platform managed \
  --allow-unauthenticated \
  --set-env-vars GCP_PROJECT=YOUR_PROJECT_ID
```

Copy the service URL from the output, then grant Secret Manager access to the Cloud Run service account:

```bash
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="serviceAccount:YOUR_PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
  --role="roles/secretmanager.secretAccessor"
```

To find your project number: `gcloud projects describe YOUR_PROJECT_ID --format="value(projectNumber)"`
In `frontend/js/config.js`, replace the URL with your Cloud Run service URL:

```js
export const API_BASE = 'https://mentora-backend-XXXXXXXXXX.us-central1.run.app';
export const WS_BASE = 'wss://mentora-backend-XXXXXXXXXX.us-central1.run.app';
```

In `frontend/js/firebase.js`, replace the config object with the one from Step 2:
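Since the WebSocket URL is the same host with a `wss://` scheme, you can optionally derive `WS_BASE` from `API_BASE` so only one URL ever needs to change; a small sketch:

```javascript
// Optional: derive the WebSocket base from the HTTP base.
// 'http' -> 'ws' and 'https' -> 'wss' at the start of the string.
const API_BASE = 'https://mentora-backend-XXXXXXXXXX.us-central1.run.app';
const WS_BASE = API_BASE.replace(/^http/, 'ws');

console.log(WS_BASE); // wss://mentora-backend-XXXXXXXXXX.us-central1.run.app
// In config.js you would export both: export { API_BASE, WS_BASE };
```

This also works for the local-development values, turning `http://localhost:3001` into `ws://localhost:3001`.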
```js
const firebaseConfig = {
  apiKey: "YOUR_API_KEY",
  authDomain: "YOUR_PROJECT_ID.firebaseapp.com",
  projectId: "YOUR_PROJECT_ID",
  storageBucket: "YOUR_PROJECT_ID.firebasestorage.app",
  messagingSenderId: "YOUR_SENDER_ID",
  appId: "YOUR_APP_ID",
};
```

```bash
firebase login
firebase use YOUR_PROJECT_ID
firebase deploy --only hosting,firestore
```

Your app will be live at https://YOUR_PROJECT_ID.web.app.
- Authenticate with GCP:

  ```bash
  gcloud auth application-default login
  ```

- Create `backend/.env`:

  ```
  GCP_PROJECT=YOUR_PROJECT_ID
  ```

- Install and start the backend:

  ```bash
  cd backend
  npm install
  npm run dev
  ```

- Point the frontend at localhost — in `frontend/js/config.js`:

  ```js
  export const API_BASE = 'http://localhost:3001';
  export const WS_BASE = 'ws://localhost:3001';
  ```

- Open `frontend/index.html` with a local server (e.g. VS Code Live Server or `npx serve frontend`).
Voice mode requires Secret Manager access on the GCP project. All other features work with just `gcloud auth application-default login`.
| Variable | Where | Description |
|---|---|---|
| `GCP_PROJECT` | `backend/.env` / Cloud Run | Your Google Cloud project ID |
| `GEMINI_API_KEY` | Secret Manager | Gemini API key for Live voice — stored as a secret, not in `.env` |
The backend uses Application Default Credentials (ADC) for Vertex AI — no API key needed for regular AI endpoints.
MIT