Your assistant for quick claim updates and policy answers.
A modern insurance concierge powered by Next.js, MongoDB Atlas, and Google Gemini. InsureChat helps customers track claim status, get answers to insurance FAQs, and find their next steps through a conversational UI.
- Claim status concierge – Recognizes claim IDs (e.g. C12345) and returns the latest status, payout, and next steps.
- Insurance FAQ assistant – Uses semantic search + Gemini to craft grounded answers from a curated knowledge base.
- Chat-first UI – Responsive layout with typing indicators, quick actions, and shift+enter multiline support.
- SEO & PWA ready – Structured data, Open Graph cards, robots + sitemap, and a web manifest for installability.
- Single data store – Claims, FAQs, and vector embeddings live together in MongoDB for simpler ops.
- Framework: Next.js 15 (App Router)
- Language: TypeScript 5.6
- Database: MongoDB Atlas + Mongoose models
- AI: Google Gemini (text + embeddings)
- Styling: Tailwind CSS with custom glassmorphism theme
- Deployment target: Vercel (or any Node-compatible host)
I chose Next.js (with TypeScript) because it gives me both frontend and backend in the same codebase. The chat UI, API routes, and deployment can all live together, which makes development easier and hosting on Vercel straightforward.
For storage, I used MongoDB. RAG setups typically use a dedicated vector database, but this dataset is very small, so instead of adding the overhead of another service I stored the embeddings directly in MongoDB and ran cosine-similarity checks in memory. This keeps the system simple while still showing how RAG works in practice. If the dataset grew larger, a vector DB would make sense, but here it isn't needed.
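The in-memory similarity check described above can be sketched in a few lines of TypeScript. The function and type names here are illustrative, not the project's actual `lib/vectorSearch.ts` API:

```typescript
// Cosine similarity between two equal-length embedding vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Hypothetical document shape: an FAQ entry with its stored embedding.
interface FAQDoc {
  question: string;
  embedding: number[];
}

// Rank all stored FAQ embeddings against a query embedding, keep the top k.
function topKMatches(query: number[], docs: FAQDoc[], k: number): FAQDoc[] {
  return [...docs]
    .sort(
      (x, y) =>
        cosineSimilarity(query, y.embedding) -
        cosineSimilarity(query, x.embedding)
    )
    .slice(0, k);
}
```

With ~50 FAQ entries, a full linear scan like this is effectively instant, which is why a dedicated vector index isn't needed here.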
```
.
├─ app/
│  ├─ api/
│  │  ├─ chat/route.ts          # RAG pipeline endpoint
│  │  └─ claim-status/route.ts  # Direct claim lookup
│  ├─ layout.tsx                # Root metadata + layout
│  ├─ head.tsx                  # Theme color & JSON-LD
│  └─ page.tsx                  # Chat experience
├─ components/
│  ├─ Header.tsx
│  ├─ Footer.tsx
│  ├─ ChatMessage.tsx
│  └─ TypingIndicator.tsx
├─ lib/
│  ├─ siteConfig.ts             # Shared SEO + branding config
│  ├─ mongodb.ts
│  ├─ gemini.ts
│  └─ vectorSearch.ts
├─ models/
│  ├─ Claim.ts
│  └─ FAQ.ts
├─ scripts/
│  └─ seed.ts                   # Populates sample claims + FAQs
└─ public/
   ├─ icon.svg
   ├─ logo-insurechat.svg
   └─ manifest.json
```
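The two Mongoose models in `models/` can be pictured as small document shapes. This is a dependency-free sketch; the field names are assumptions for illustration, not the repo's actual schema:

```typescript
// Hypothetical document shapes behind models/Claim.ts and models/FAQ.ts.
interface ClaimDoc {
  claimId: string; // e.g. "C12345"
  status: string;  // e.g. "approved", "in review"
  payout: number;  // payout amount in the policy's currency
  nextSteps: string;
}

interface FAQEntry {
  question: string;
  answer: string;
  embedding: number[]; // Gemini embedding vector, stored with the entry
}

// Example: render a claim-status reply from a fetched document.
function formatClaimReply(claim: ClaimDoc): string {
  return `Claim ${claim.claimId} is ${claim.status}. Payout: ${claim.payout}. Next steps: ${claim.nextSteps}`;
}
```

Keeping the embedding on the same document as the FAQ text is what lets claims, FAQs, and vectors live in one store.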
- Node.js 18 or newer
- MongoDB Atlas cluster (or local Mongo instance)
- Gemini API key (Google AI Studio)
Install dependencies:

```bash
npm install
```

Create a `.env` file in the project root:
```bash
MONGODB_URI=your_mongodb_connection_string
GEMINI_API_KEY=your_gemini_api_key
NEXT_PUBLIC_SITE_URL=https://your-production-domain
```

`NEXT_PUBLIC_SITE_URL` is optional until you deploy.

Seed the database:

```bash
npm run seed
```

This generates ≈50 claims and ≈50 FAQ entries with embeddings so the assistant can respond immediately.
Start the development server:

```bash
npm run dev
```

Visit http://localhost:3000 and start chatting.
- Intent detection – Incoming messages are inspected for claim IDs.
- Claim retrieval – When a claim ID is present, data is fetched directly via the `Claim` model.
- RAG flow – Otherwise, Gemini generates an embedding which is compared (cosine similarity) against FAQ vectors stored in MongoDB.
- Response generation – Top-matching FAQ content is provided to Gemini to compose a grounded answer.
- Fallbacks – If the question is out-of-scope, InsureChat politely declines.
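The intent-detection step above boils down to spotting a claim ID in the message. A minimal sketch, assuming IDs follow the `C12345` shape shown earlier (the exact pattern is an assumption, not the project's actual rule):

```typescript
// Match a claim ID like "C12345": the letter C followed by five digits.
const CLAIM_ID_PATTERN = /\bC\d{5}\b/i;

// Return the claim ID found in a message, or null to fall through to RAG.
function extractClaimId(message: string): string | null {
  const match = message.match(CLAIM_ID_PATTERN);
  return match ? match[0].toUpperCase() : null;
}
```

A `null` result here is what routes the message into the embedding + similarity path instead of the direct claim lookup.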
```
        +-------------------+
        |     Frontend      |
        |  (Next.js + UI)   |
        +-------------------+
                  |
                  v
        +-------------------+
        |    API Routes     |
        |   (Next.js App)   |
        +-------------------+
           |             |
           v             v
    +-------+    +-------------------+
    | Mongo |    |    Gemini API     |
    |  DB   |    | (Embeddings + LLM)|
    +-------+    +-------------------+
           |             ^
           v             |
     Claim Lookup   FAQ + Context
       Response     RAG Response
                  |
                  v
        +-------------------+
        |   Chatbot Reply   |
        +-------------------+
```

| Command | Description |
|---|---|
| `npm run dev` | Start the development server |
| `npm run build` | Create a production build |
| `npm run start` | Run the production build locally |
| `npm run seed` | Seed MongoDB with demo claims + FAQs |
- `app/layout.tsx` defines Open Graph, Twitter, robots, and canonical metadata.
- `app/head.tsx` injects Organization JSON-LD and sets the theme color.
- `app/robots.ts` and `app/sitemap.ts` expose crawl-friendly routes.
- `public/manifest.json` + `icon.svg` enable install prompts in modern browsers.
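The Organization JSON-LD injected by `app/head.tsx` might look roughly like this. The field values are illustrative assumptions, not the project's actual markup:

```typescript
// Build a JSON-LD string for an Organization, suitable for embedding in
// a <script type="application/ld+json"> tag in the document head.
function organizationJsonLd(siteUrl: string): string {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "Organization",
    name: "InsureChat",
    url: siteUrl,
  });
}
```

Search engines parse this blob to associate the site with a named organization, which is what makes the page "structured-data ready."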
- Push to GitHub: https://github.com/pranav89624/InsureChat
- Import the repo into Vercel (or your platform of choice).
- Define the same environment variables (`MONGODB_URI`, `GEMINI_API_KEY`, `NEXT_PUBLIC_SITE_URL`).
- Trigger a build – Next.js will produce `/robots.txt` and `/sitemap.xml`, and hydrate the chat experience.
- MongoDB auth errors: Ensure `MONGODB_URI` includes credentials and that your IP allowlist covers the hosting environment.
- Gemini quota or auth issues: Re-check the API key in Google AI Studio and confirm it has access to the `gemini-embedding-001` model.
- Seed script fails: Both env vars must be set; the Mongo connection string should name the database (e.g. `mongodb+srv://.../insurechat`).
MIT © 2025 Pranav Verma