This project demonstrates a complete pipeline for emotion detection using deep learning.
It consists of two main parts:

- **Training & Model Preparation**
  - A Google Colab notebook (`Emotion_Detection.ipynb`) that explains the full workflow: data preprocessing, model training, evaluation, and ONNX export.
- **Web Application**
  - A simple web-based interface (the `/web-app` folder) built with ONNX Runtime Web.
  - Users can upload a headshot photo and receive predicted emotions in real time.
Together, these components form an end-to-end solution:
from training your own emotion detection model → to deploying it in the browser.
The Colab notebook walks through the end-to-end process of building and exporting an emotion detection model.
- Install required libraries (`torch`, `torchvision`, `onnxruntime`, `scikit-learn`, etc.).
- Configure the Kaggle API for dataset downloads.
- Download the FER-2013 and AffectNet datasets from Kaggle.
- Extract the archives and organize them into training/testing directories.
- Map fine-grained emotion labels (angry, sad, happy, etc.) into 3 broad categories: `negative`, `neutral`, `positive`.
- Define consistent class indices for training.
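The exact groupings live in the notebook; as a sketch (assuming a typical FER-2013/AffectNet label set), the collapse into three categories and fixed class indices might look like:

```python
# Hypothetical mapping from fine-grained labels to the 3 broad categories.
# The notebook's actual groupings may differ.
FINE_TO_BROAD = {
    "angry": "negative", "disgust": "negative", "fear": "negative", "sad": "negative",
    "neutral": "neutral",
    "happy": "positive", "surprise": "positive",
}

# Consistent class indices used throughout training.
CLASS_TO_IDX = {"negative": 0, "neutral": 1, "positive": 2}

def encode_label(fine_label: str) -> int:
    """Map a fine-grained emotion name to its broad class index."""
    return CLASS_TO_IDX[FINE_TO_BROAD[fine_label.lower()]]
```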
- Custom PyTorch `Dataset` class for flexible folder-based loading.
- Preprocessing pipeline:
  - Convert grayscale → RGB (3 channels).
  - Resize to 224×224.
  - Normalize using ImageNet mean/std.
- Combine FER-2013 + AffectNet into a unified dataset.
- Split into train / validation / test sets.
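The notebook implements the preprocessing with torchvision transforms; the same three steps can be sketched in plain NumPy (a nearest-neighbor resize stands in for the notebook's interpolation):

```python
import numpy as np

IMAGENET_MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
IMAGENET_STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def preprocess(gray: np.ndarray, size: int = 224) -> np.ndarray:
    """Grayscale HxW uint8 image -> normalized (3, 224, 224) float32 tensor."""
    h, w = gray.shape
    # Nearest-neighbor resize to size x size (the notebook uses a library resize).
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    resized = gray[rows][:, cols].astype(np.float32) / 255.0
    # Grayscale -> RGB by channel replication, then ImageNet normalization.
    rgb = np.stack([resized] * 3, axis=0)  # (3, size, size)
    return (rgb - IMAGENET_MEAN[:, None, None]) / IMAGENET_STD[:, None, None]
```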
- Load ResNet18 pretrained on ImageNet.
- Freeze most layers, fine-tune only:
  - The last convolutional block.
  - The final fully-connected (FC) layer, replaced with 3 outputs.
- Loss: CrossEntropy.
- Optimizer: Adam with LR scheduler.
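The freeze-and-replace pattern above can be sketched on a stand-in module (in the notebook it is applied to torchvision's pretrained ResNet18, whose last convolutional block is `layer4` and whose head `fc` is replaced by `nn.Linear(model.fc.in_features, 3)`):

```python
import torch.nn as nn

class TinyNet(nn.Module):
    """Stand-in for ResNet18: a frozen body, a 'layer4' block, and an 'fc' head."""
    def __init__(self):
        super().__init__()
        self.layer3 = nn.Linear(8, 8)  # stays frozen
        self.layer4 = nn.Linear(8, 8)  # fine-tuned (last block)
        self.fc = nn.Linear(8, 3)      # replaced head: 3 emotion classes

    def forward(self, x):
        return self.fc(self.layer4(self.layer3(x)))

def freeze_for_finetuning(model: nn.Module, trainable=("layer4", "fc")) -> nn.Module:
    """Freeze every parameter except those under the named submodules."""
    for name, param in model.named_parameters():
        param.requires_grad = any(name.startswith(t) for t in trainable)
    return model
```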
- Features:
  - Mixed-precision training (`GradScaler`).
  - Early stopping (patience = 15).
  - Checkpoint saving + best-model export.
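Of these training features, the early-stopping rule reduces to a small bookkeeping class; a minimal sketch (patience = 15 as in the notebook; the checkpointing and `GradScaler` logic would wrap around it):

```python
class EarlyStopping:
    """Stop training when validation loss fails to improve for `patience` epochs."""
    def __init__(self, patience: int = 15):
        self.patience = patience
        self.best_loss = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss: float) -> bool:
        """Record one epoch's validation loss; return True when training should stop."""
        if val_loss < self.best_loss:
            self.best_loss = val_loss  # new best: the checkpoint would be saved here
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience
```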
- Classification metrics with `sklearn`: accuracy, confusion matrix, per-class scores.
- Visualization:
  - Raw confusion matrix.
  - Normalized confusion matrix.
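The notebook computes these with `sklearn.metrics`; the raw and row-normalized matrices it plots amount to the following NumPy sketch for the 3-class case:

```python
import numpy as np

def confusion(y_true, y_pred, n_classes: int = 3) -> np.ndarray:
    """Raw confusion matrix: rows = true class, columns = predicted class."""
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

def normalize_rows(cm: np.ndarray) -> np.ndarray:
    """Row-normalized matrix: each row sums to 1, per-class recall on the diagonal."""
    return cm / cm.sum(axis=1, keepdims=True)
```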
- Upload a headshot image.
- Apply the same preprocessing pipeline.
- Predict emotion label using the trained model.
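Whatever produces the three logits, the final prediction step is the same softmax → argmax mapping; a minimal NumPy sketch (the class order is an assumption):

```python
import numpy as np

CLASSES = ["negative", "neutral", "positive"]  # assumed index order

def predict_from_logits(logits: np.ndarray):
    """Turn the model's 3 raw logits into a label plus per-class probabilities."""
    z = logits - logits.max()               # shift for numerical stability
    probs = np.exp(z) / np.exp(z).sum()     # softmax
    return CLASSES[int(probs.argmax())], probs
```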
- Export trained PyTorch model → ONNX format.
- Define input/output node names (`input`, `output`).
- Save `Emotion_Model.onnx` for deployment.
- Verify model input/output shapes in Colab.
The web app allows users to upload a headshot image and detect the emotion in real time using the ONNX model exported from the notebook.
- Upload and preview an image directly in the browser.
- Preprocess images to match training input size (224×224) and normalization.
- Run inference locally in the browser using ONNX Runtime Web (`onnxruntime-web`).
- Display the predicted emotion: `negative`, `neutral`, or `positive`.
- Show probabilities for each class in a readable percentage format.
- Colored result display based on predicted emotion.
- `index.html`
  - File input for uploading images (hidden input styled via label).
  - Preview area for the selected image.
  - Button to run inference.
  - Output area to display the predicted emotion and class probabilities.
- `app.js`
  - Handles image preprocessing: resizing, converting to tensor, normalization.
  - Loads the ONNX model using `ort.InferenceSession`.
  - Runs inference and extracts output probabilities.
  - Updates the DOM with the predicted label and percentage probabilities.
- `style.css`
  - Styles buttons, preview image, and results.
  - Provides responsive layout and colored indicators for the predicted emotion.
- User clicks Upload button → selects an image.
- Image is previewed in the browser.
- Click Run → `app.js` converts the image into an ONNX-compatible tensor.
- ONNX Runtime Web evaluates the model → returns predictions.
- The app displays:
  - The predicted emotion label (highlighted with color).
  - Class probabilities as percentages (`negative: 82%`, `neutral: 10%`, `positive: 8%`).
- Client-side inference: No server required. Runs entirely in the browser.
- Lightweight: Works with a small ResNet18 ONNX model.
- User-friendly: Intuitive interface for end users to analyze headshot images instantly.