🧠 Facial Expression Recognition

Facial Expression Recognition is a project that uses a trained Convolutional Neural Network (CNN) to identify human emotions from grayscale face images. It supports multiple emotion classes such as happy, sad, angry, and surprise, and is deployed as a FastAPI service for easy integration.

Project Overview

The model classifies facial expressions in images using deep learning. It is trained on labeled face-image data and recognizes common human emotions. The model is wrapped in a FastAPI backend, so you can query it in real time via HTTP requests.

📁 Repository Structure

```
facial-expression-recognition/
├── Images/                  # Example images or screenshots
├── code/                    # Training and utility code
├── model/                   # Trained model data
├── template/                # Web or UI templates
├── .gitattributes
├── README.md
├── class_weights.json       # Class weights for the model
├── config.pkl               # Configuration and other model info
├── main.py                  # FastAPI backend
├── requirements.txt         # Python dependencies
```

Features

  • Detects major facial emotions from images
  • Uses a trained CNN model
  • Easily deployable using FastAPI

Tech Stack

  • Python
  • TensorFlow / Keras (CNN model)
  • FastAPI (backend API)
  • HTML / frontend templates (if included)
  • Jupyter Notebook / Python scripts (for training)

How It Works

  1. Input image — A face image is passed to the model.
  2. Preprocessing — The image is prepared and normalized.
  3. CNN Model — A trained CNN model predicts the emotion category.
  4. Output — The API returns the predicted emotion label.
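The preprocessing and output steps above can be sketched as follows. This is a minimal illustration, not the project's actual code: the 48×48 input size and the label ordering are assumptions (common for FER-style datasets); check config.pkl and main.py for the real values.

```python
import numpy as np

# Assumed emotion labels; the actual class order used at training
# time should be taken from config.pkl.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def preprocess(gray_image: np.ndarray) -> np.ndarray:
    """Scale a grayscale face image to [0, 1] and add batch/channel dims."""
    x = gray_image.astype("float32") / 255.0
    return x.reshape(1, *gray_image.shape, 1)

def decode(probs: np.ndarray) -> str:
    """Map the CNN's probability vector to its emotion label."""
    return EMOTIONS[int(np.argmax(probs))]

# In the real pipeline, model.predict(preprocess(img)) would supply
# the probabilities; here we use a fake vector for illustration.
fake_probs = np.array([0.05, 0.01, 0.04, 0.80, 0.05, 0.03, 0.02])
print(decode(fake_probs))  # happy
```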

📦 Installation & Setup

1. Clone the Repository

```
git clone https://github.com/AzeemAIDev/facial-expression-recognition.git
cd facial-expression-recognition
```

2. Install Dependencies

```
pip install -r requirements.txt
```

3. Run the FastAPI Server

```
uvicorn main:app --reload
```

4. Test the API

Open:

http://localhost:8000/docs

This opens the Swagger UI, where you can try the API endpoints. To use the bundled frontend instead, serve index.html from the template/ folder with a static server, e.g. right-click it in VS Code and select "Open with Live Server".

Testing & Usage

You can send a face image to the API and receive the predicted emotion as JSON. Use a tool like Postman or the Swagger UI to test it.

Example endpoint:

POST /predict
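A minimal client sketch for this endpoint, using only the Python standard library. The /predict route comes from this README, but the multipart field name "file" and the JSON response shape are assumptions; check main.py for the actual parameter name.

```python
import json
import urllib.request

PREDICT_URL = "http://localhost:8000/predict"  # default uvicorn host/port

def build_multipart(field: str, filename: str, payload: bytes,
                    boundary: str = "----fer-boundary") -> tuple[bytes, str]:
    """Build a multipart/form-data body and its Content-Type header value."""
    body = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field}"; filename="{filename}"\r\n'
        "Content-Type: application/octet-stream\r\n\r\n"
    ).encode() + payload + f"\r\n--{boundary}--\r\n".encode()
    return body, f"multipart/form-data; boundary={boundary}"

def predict(image_path: str) -> dict:
    """POST a face image to /predict and return the parsed JSON response."""
    with open(image_path, "rb") as f:
        # "file" is an assumed field name; match it to the FastAPI parameter.
        body, content_type = build_multipart("file", image_path, f.read())
    req = urllib.request.Request(PREDICT_URL, data=body,
                                 headers={"Content-Type": content_type})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

With the server running, `predict("face.jpg")` returns the API's JSON response, e.g. a dict containing the predicted emotion label.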

Notes

  • The model may not work perfectly with very noisy images.

Author

Azeem
ML Engineer & AI Learner
https://github.com/AzeemAIDev
