A voice AI agent built with Layercode, Twilio, Next.js, and Cloudflare Workers.

anupj/voice-ai-agent

Crisis Support Voice AI

A real-time voice AI application providing 24/7 crisis support with emergency services integration.

What It Does

  • Natural voice conversations with an AI crisis counselor
  • Real-time speech-to-text and text-to-speech via Layercode
  • Crisis intervention tools: grounding exercises, breathing exercises, safety assessment
  • Emergency services: Find and get directions to nearest hospital or police station using OpenStreetMap
  • Browser-based: No app installation required
  • Sub-second latency: Natural, responsive conversations

Tech Stack

  • Frontend: Next.js 15, React 19, Layercode React SDK
  • Voice Processing: Layercode (Deepgram STT, Rime TTS)
  • Telephony: Twilio
  • AI: OpenAI GPT-4o with function calling
  • Maps: OpenStreetMap (Nominatim, Overpass, OSRM)
  • Deployment: Cloudflare Workers

Architecture

Voice is handled by Layercode's edge infrastructure. The backend receives transcribed text via webhook, processes it with OpenAI, and streams responses back via Server-Sent Events (SSE).

User Browser ←→ Layercode Cloud ←→ Your Backend (Next.js)
   (Audio)        (STT/TTS)          (AI Logic)
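The SSE leg of this flow can be sketched as a plain formatter: each chunk the model streams back is wrapped in a `data:` frame before the webhook handler returns it. The event field names (`type`, `content`) below are illustrative assumptions, not Layercode's actual wire schema.

```typescript
// Wrap one streamed text chunk as a Server-Sent Events frame.
// The JSON payload shape is an assumption for illustration, not Layercode's schema.
function toSseFrame(event: { type: string; content: string }): string {
  return `data: ${JSON.stringify(event)}\n\n`;
}

// A Response assembled from such frames is what a route handler would return.
function sseResponse(chunks: string[]): Response {
  const body = chunks
    .map((c) => toSseFrame({ type: "response.tts", content: c }))
    .join("");
  return new Response(body, {
    headers: { "Content-Type": "text/event-stream", "Cache-Control": "no-cache" },
  });
}
```

In production the handler would stream frames incrementally rather than joining them up front, but the framing is the same.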

Prerequisites

  • Node.js 18+ and npm
  • OpenAI API key
  • Layercode account (free at dash.layercode.com)
  • cloudflared for local development (optional but recommended)

Quick Start

1. Clone and Install

git clone <your-repo-url>
cd fullstack-nextjs-cloudflare
npm install

2. Set Up Environment Variables

Copy .env.example to .env:

cp .env.example .env

Fill in your environment variables:

# Required
NEXT_PUBLIC_LAYERCODE_AGENT_ID=your_agent_id_here
LAYERCODE_API_KEY=your_api_key_here
LAYERCODE_WEBHOOK_SECRET=your_webhook_secret_here
OPENAI_API_KEY=your_openai_key_here

# Optional
NEXTJS_ENV=development

Where to get these values:

  1. Go to dash.layercode.com
  2. Create or select an agent
  3. Copy the Agent ID from the dashboard
  4. Click Connect Your Backend to get your Webhook Secret
  5. Go to Settings to get your Layercode API Key
  6. Get your OpenAI API Key from platform.openai.com

3. Start Development Server

npm run dev

Your app will be running at http://localhost:3000

4. Expose Your Webhook (Required for Voice to Work)

Layercode needs to send webhooks to your backend. In development, you need to expose localhost using a tunnel:

Option A: Using Layercode CLI (Recommended)

npx @layercode/cli tunnel --port=3000 --path=/api/agent

This will:

  • Create a public tunnel to your localhost
  • Automatically update your agent's webhook URL in the Layercode dashboard
  • Show logs of incoming webhooks

Option B: Using cloudflared

# Install cloudflared (macOS)
brew install cloudflared

# Start tunnel
cloudflared tunnel --url http://localhost:3000

Copy the public URL (e.g., https://your-tunnel.trycloudflare.com) and:

  1. Go to Layercode dashboard
  2. Click Connect Your Backend
  3. Set Webhook URL to: https://your-tunnel.trycloudflare.com/api/agent
  4. Save

Note: The cloudflared URL changes each time you restart the tunnel.

5. Test Your Voice Agent

  1. Open http://localhost:3000
  2. Click Connect
  3. Allow microphone access
  4. Start speaking!

Try saying:

  • "I'm feeling anxious"
  • "Can you help me find a hospital?" (then provide your location)
  • "I need directions to the nearest police station"

Project Structure

fullstack-nextjs-cloudflare/
├── app/
│   ├── api/
│   │   ├── agent/route.ts          # Main webhook handler (AI logic)
│   │   └── authorize/route.ts       # Session authorization
│   ├── ui/
│   │   ├── VoiceAgent.tsx          # Main voice interface
│   │   ├── MicrophoneButton.tsx    # Mic controls
│   │   ├── TranscriptConsole.tsx   # Shows conversation
│   │   └── ...                     # Other UI components
│   ├── utils/
│   │   ├── emergency-services.ts   # OpenStreetMap integration
│   │   ├── updateMessages.ts       # Transcript management
│   │   └── ...
│   ├── page.tsx                    # Main entry point
│   └── layout.tsx                  # App layout
├── layercode.config.json           # System prompt & welcome message
├── package.json
├── next.config.ts
└── README.md

Key Files to Know

API Routes

  • app/api/agent/route.ts: Main webhook endpoint

    • Receives transcribed user speech
    • Processes with OpenAI GPT-4o
    • Implements crisis support tools
    • Streams responses back via SSE
  • app/api/authorize/route.ts: Session authorization

    • Called by frontend to start a session
    • Proxies request to Layercode API
    • Returns session key for WebSocket connection

Configuration

  • layercode.config.json: Contains system prompt and welcome message
    • Define AI personality and behavior
    • Crisis counselor guidelines
    • Tool usage instructions

Emergency Services

  • app/utils/emergency-services.ts: OpenStreetMap integration
    • Geocoding (location → coordinates)
    • Finding nearest hospitals/police
    • Calculating walking directions
    • Formatting verbal instructions
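Under the hood, "find the nearest hospital" reduces to ranking Overpass candidates by great-circle distance from the caller's geocoded position. A minimal sketch (the `Facility` shape and function names here are hypothetical, not the file's actual exports):

```typescript
// Great-circle distance in metres between two WGS84 points (haversine formula).
function haversineMetres(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6371000; // mean Earth radius in metres
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Hypothetical candidate shape returned by an Overpass query.
interface Facility { name: string; lat: number; lon: number; }

// Pick the candidate closest to the caller's position.
function nearest(origin: { lat: number; lon: number }, candidates: Facility[]): Facility | undefined {
  return [...candidates].sort(
    (a, b) =>
      haversineMetres(origin.lat, origin.lon, a.lat, a.lon) -
      haversineMetres(origin.lat, origin.lon, b.lat, b.lon)
  )[0];
}
```

OSRM then supplies the actual walking route; haversine is only used to choose which facility to route to.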

Available Scripts

# Development
npm run dev              # Start dev server with Turbopack

# Building
npm run build            # Build for production
npm run start            # Start production server

# Deployment
npm run deploy           # Deploy to Cloudflare Workers
npm run preview          # Preview Cloudflare deployment locally

# Linting
npm run lint             # Run ESLint

Customizing the AI

Change System Prompt

Edit layercode.config.json:

{
  "prompt": "Your custom system prompt here...",
  "welcome_message": "Hi, how can I help you today?"
}

Add New Tools

Edit app/api/agent/route.ts:

const myNewTool = tool({
  description: 'When to use this tool',
  inputSchema: z.object({
    param: z.string().describe('Parameter description')
  }),
  execute: async ({ param }) => {
    // Your tool logic here
    return {
      success: true,
      message: 'Response to tell the user'
    };
  }
});

// Add to tools object in streamText()
tools: {
  five_four_three_two_one_grounding,
  guide_breathing_exercise,
  assess_safety,
  find_emergency_service,
  myNewTool  // Add your new tool
}

Built-in Crisis Support Tools

1. Grounding Exercises

  • 5-4-3-2-1 Technique: Sensory grounding for anxiety/panic
  • Breathing Exercise: Guided 4-4-4 breathing

2. Safety Assessment

  • Detects suicidal ideation or self-harm risk
  • Provides crisis resources (999, Samaritans: 116 123)
  • Escalation protocols

3. Emergency Services (NEW!)

  • Uses free OpenStreetMap APIs
  • Finds nearest hospital or police station (10km radius)
  • Provides verbal walking directions
  • No API keys required

Deployment

Deploy to Cloudflare Workers

npm run deploy

After deployment:

  1. Copy your deployed URL
  2. Update webhook URL in Layercode dashboard
  3. Set environment variables in Cloudflare dashboard

Deploy to Vercel

# Push to GitHub
git push origin main

# In Vercel dashboard:
# 1. Import your GitHub repo
# 2. Add environment variables
# 3. Deploy

Important for Vercel: Disable "Vercel Authentication" in project settings to allow Layercode webhooks.

Troubleshooting

Voice agent not responding?

  1. Check tunnel is running and webhook URL is correct in dashboard
  2. Check webhook logs in Layercode dashboard
  3. Verify environment variables are set
  4. Check browser console for errors

Getting 401 errors?

  • Verify LAYERCODE_WEBHOOK_SECRET matches the one in the dashboard
  • Check webhook signature verification in /api/agent/route.ts
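Signature verification failures usually come down to comparing an HMAC of the raw request body against the header Layercode sends. The sketch below is a generic HMAC-SHA256 check; the actual header name and signing scheme Layercode uses are assumptions here, so confirm them against the Layercode docs and the existing code in `/api/agent/route.ts`.

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Generic HMAC-SHA256 signature check over the raw webhook body.
// Constant-time comparison avoids leaking the signature via timing.
function verifySignature(rawBody: string, signatureHex: string, secret: string): boolean {
  const expected = createHmac("sha256", secret).update(rawBody).digest();
  const received = Buffer.from(signatureHex, "hex");
  return received.length === expected.length && timingSafeEqual(received, expected);
}
```

A common gotcha: verify the *raw* body string, not a re-serialized `JSON.parse`/`JSON.stringify` round trip, since key ordering changes break the digest.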

No audio?

  • Check microphone permissions in browser
  • Try a different browser (Chrome/Edge work best)
  • Check browser console for WebRTC errors

Emergency services not working?

  • Ensure OpenStreetMap APIs are accessible (not blocked by firewall)
  • Try more specific locations (postcodes work best)
  • Check console logs for API error messages

Development Tips

Viewing Conversation History

The conversation is stored in memory:

// In app/api/agent/route.ts
console.log('--- final message history ---');
prettyPrintMsgs(conversations[conversation_id]);

Check your terminal to see the full conversation history with turn IDs.
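The in-memory store the snippet above reads from can be pictured as a map from conversation ID to message list. This is a hypothetical sketch of that shape (the real `Msg` fields and `prettyPrintMsgs` output format in the repo may differ):

```typescript
// Hypothetical shape of the per-conversation in-memory history.
type Msg = { role: "user" | "assistant" | "tool"; content: string; turnId?: string };

const conversations: Record<string, Msg[]> = {};

// Append a message, creating the conversation's list on first use.
function appendMessage(conversationId: string, msg: Msg): void {
  (conversations[conversationId] ??= []).push(msg);
}

// Render one line per message, including the turn ID when present.
function prettyPrintMsgs(msgs: Msg[]): string {
  return msgs
    .map((m) => `[${m.role}${m.turnId ? ` ${m.turnId}` : ""}] ${m.content}`)
    .join("\n");
}
```

Note that in-memory state resets on every redeploy and is not shared across Cloudflare Worker instances, so this is only suitable for development.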

Testing Emergency Services Locally

// In app/utils/emergency-services.ts
const location = await geocodeLocation("Manchester Piccadilly");
const hospital = await findNearestEmergencyService(
  location.lat,
  location.lon,
  'hospital'
);

Webhook Logs

View webhook requests/responses in Layercode dashboard:

  • Go to your agent
  • Click "Webhook Logs" tab
  • See all requests, responses, and errors

Environment Variables Reference

| Variable | Required | Description |
|---|---|---|
| `NEXT_PUBLIC_LAYERCODE_AGENT_ID` | Yes | Your Layercode agent ID (exposed to the browser) |
| `LAYERCODE_API_KEY` | Yes | API key for backend requests |
| `LAYERCODE_WEBHOOK_SECRET` | Yes | Secret for webhook signature verification |
| `OPENAI_API_KEY` | Yes | OpenAI API key for GPT-4o |
| `NEXTJS_ENV` | No | Set to `development` for local dev |

Contributing

  1. Create a feature branch
  2. Make your changes
  3. Test locally with tunnel
  4. Submit a pull request

License

[Your License Here]

Built with ❤️ for crisis support accessibility
