
🌉 SignBridge - Breaking Barriers with AI


SignBridge is a powerful, real-time Indian Sign Language (ISL) translator that runs entirely in your browser. By leveraging advanced computer vision and machine learning (ONNX Runtime + MediaPipe), it bridges the communication gap between the Deaf community and the rest of the world—without needing server-side processing.

🚀 Key Features

  • 👉 Real-Time ISL Translation: Instantly converts ISL gestures into text using a lightweight, locally optimized ONNX model.
  • 🗣️ Text-to-Speech: Type your message and have it spoken aloud.
  • 🌐 Multilingual Support: Fully localized UI in English, Hindi (हिंदी), Marathi (मराठी), and Gujarati (ગુજરાતી).
  • 🔒 Privacy First: All inference happens on your device. No video data is ever sent to a server.
  • ⚡ High Performance: Powered by WebAssembly (WASM) and SIMD instructions for smooth performance on standard devices.
  • 🤖 AI Assistant: Integrated chatbot powered by DeepSeek-V3 that answers questions about using the platform.

🛠️ Tech Stack

  • Frontend: React + Vite (Fast & lightweight)
  • Styling: TailwindCSS + Framer Motion (Beautiful & responsive animations)
  • AI/ML Engine:
    • MediaPipe Hands: For skeletal hand tracking.
    • ONNX Runtime Web: For running the custom ISL classification model.
  • Icons: Lucide React

📦 Installation & Setup

  1. Clone the repository

    git clone https://github.com/your-username/SignBridge.git
    cd SignBridge
  2. Install dependencies

    npm install
  3. Start the development server

    npm run dev

    Open http://localhost:5173 in your browser.

  4. Build for production

    npm run build

🧠 How It Works

  1. Detection: MediaPipe detects 21 landmarks on each hand in the video stream.
  2. Preprocessing: Coordinates are normalized (relative to the wrist) to ensure the model understands gestures regardless of camera distance.
  3. Inference: The normalized data is fed into a quantized Random Forest model (via ONNX) which predicts one of 42 ISL signs.
  4. Result: The prediction is displayed instantly with a confidence score.

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


Built with ❤️ for a more inclusive future.
