AlttiK/Minterviewer

Minterviewer - Chrome Extension

Tutor and Mock Interview Practice

Features

  • Chrome Extension

    • Clean chat-based interview interface
    • Uses a random LeetCode problem from the NeetCode 150 list
    • Displays AI-generated questions with TTS audio playback
    • End-of-session feedback with strengths, weaknesses, and actionable improvements
  • FastAPI Backend

    • POST /chat - Send messages to AI interviewer
    • POST /feedback - Get end-of-session feedback
    • GET /health - Check server and Ollama status
    • Async endpoints for performance
  • AI Integration

    • Local Ollama LLM (Llama 3.2)
    • LeetCode-style interview conversation flow
    • Tracks candidate progress for feedback
  • Text To Speech

    • Local Kokoro-onnx TTS for audio generation
    • AI responses converted to speech

Current Development

  • Live Kit Integration
  • Microphone Input and STT
  • General Tutoring

Setup Instructions

Prerequisites

  1. Python 3.10+

    python --version
  2. Ollama (Local LLM runtime)

  3. Kokoro-onnx TTS

    • Download a Voice Model (~300 MB):
      # adds .onnx model file to backend
      curl -L \
        https://github.com/thewh1teagle/kokoro-onnx/releases/download/model-files-v1.0/kokoro-v1.0.onnx \
        -o backend/voices/kokoro-v1.0.onnx
      
  4. Google Chrome (for extension)


Backend Setup

Step 1: Install Ollama and Pull Model

# Start Ollama service
ollama serve

# In a new terminal, pull the AI model (only once)
ollama pull llama3.2:latest

# Check downloaded models
ollama list

Step 2: Install Python Dependencies

cd backend

# Create virtual environment (recommended)
python -m venv venv
.\venv\Scripts\Activate.ps1   # Windows PowerShell (macOS/Linux: source venv/bin/activate)

# Install dependencies
pip install -r requirements.txt

Step 3: Start FastAPI Server

# Make sure you're in the backend directory
cd backend

# Run the server
python app.py

Server will start at: http://localhost:8000

Check health: http://localhost:8000/health


Chrome Extension Setup

Step 1: Load Extension in Chrome

  1. Open Chrome and go to: chrome://extensions/

  2. Enable Developer mode (toggle in top-right)

  3. Click "Load unpacked"

  4. Navigate to and select the chrome-extension folder

  5. The extension should now appear in your extensions list


Usage

Starting an Interview

  1. Make sure backend is running:

    • Ollama service: ollama serve
    • FastAPI server: python app.py (in backend folder)
  2. Open the extension by clicking the icon in Chrome

    • Open the side panel to keep the extension open
  3. Configure interview:

    • Interview Type: SWE Intern
    • Question Type: LeetCode-style DSA
  4. Click "Start Interview"

  5. The AI interviewer will:

    • Introduce itself
    • Open LeetCode
    • Describe the first LeetCode-style question
    • Play audio

During the Interview

  • Type your approach or solution in the text box
  • Ask for hints if you're stuck
  • Clarify the question (e.g., "What are the input constraints?")
  • The AI keeps track of all questions and inputs (LeetCode code parsing coming soon)

Ending the Interview

  1. Click "End Interview" button

  2. The AI will generate feedback with:

    • Strengths
    • Weaknesses
    • Actionable improvements
  3. Click "Return to Home Page" to go back to interview setup


API Documentation

POST /chat

Send a message to the AI interviewer.

Request:

{
  "messages": [
    {"role": "system", "content": "System prompt"},
    {"role": "user", "content": "User message"}
  ],
  "session_id": "unique-session-id"
}

Response:

{
  "message": "AI response text",
  "audio_url": "/audio/speech_123.wav"
}
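As a sketch, the request and response shapes above can be exercised with Python's standard library. The field names and base URL follow the examples in this README; the actual send requires the FastAPI server from the setup section to be running.

```python
import json
import urllib.request

def build_chat_request(system_prompt, user_message, session_id):
    """Assemble the JSON body expected by POST /chat."""
    return {
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        "session_id": session_id,
    }

def send_chat(body, base_url="http://localhost:8000"):
    """POST the body to /chat and return the parsed JSON response.
    Assumes the backend is running on the default port."""
    req = urllib.request.Request(
        f"{base_url}/chat",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

body = build_chat_request("You are a LeetCode interviewer.", "Hello!", "demo-1")
print(body["session_id"])  # demo-1
```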

POST /feedback

Get end-of-session feedback.

Request:

{
  "messages": [...],
  "session_id": "unique-session-id"
}

Response:

{
  "feedback": {
    "strengths": "Good problem-solving approach...",
    "weaknesses": "Could improve time complexity analysis...",
    "improvement": "Practice more dynamic programming problems"
  }
}
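A small sketch of pulling the three feedback fields out of a response like the one above. The field names mirror the sample JSON; the response dict is hard-coded here purely for illustration.

```python
# Sample /feedback response, copied from the documentation above
response = {
    "feedback": {
        "strengths": "Good problem-solving approach...",
        "weaknesses": "Could improve time complexity analysis...",
        "improvement": "Practice more dynamic programming problems",
    }
}

feedback = response["feedback"]
for section in ("strengths", "weaknesses", "improvement"):
    # Print each feedback section with a capitalized label
    print(f"{section.title()}: {feedback[section]}")
```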

GET /health

Check server health and Ollama connection.

Response:

{
  "status": "healthy",
  "ollama": "connected",
  "sessions": 3
}
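For example, a minimal stdlib-only probe for /health (the port follows the setup section; the function returns None instead of raising when the backend is down, so it can double as a startup check):

```python
import json
import urllib.request
import urllib.error

def check_health(base_url="http://localhost:8000"):
    """Return the parsed /health JSON, or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=2) as resp:
            return json.loads(resp.read())
    except (urllib.error.URLError, OSError):
        return None

status = check_health()
print("backend up" if status else "backend down")
```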

Dependencies

Backend

  • FastAPI - Web framework
  • Uvicorn - ASGI server
  • httpx - Async HTTP client for Ollama
  • pydantic - Data validation
  • kokoro-onnx - Text-to-speech

Chrome Extension

  • No external dependencies

External Services

  • Ollama - Local LLM runtime
  • Kokoro-onnx - Local TTS

Privacy & Security

Fully Local - All data stays on your machine

  • No cloud API calls
  • No data collection
  • No internet required (after the initial model downloads)

Free & Open Source

  • All components are free
  • No API keys needed
  • No subscription costs

Happy interviewing!

About

A fully local AI tutor and mock interviewer Chrome extension with audio, feedback, and zero cloud dependencies.
