Updated: 2026-04-14
- SonicHand is a final project by Yueqiao Zhang for MUMT306 at McGill University.
This project converts:
- Hand gestures from a webcam (MediaPipe)
- Distance data from Arduino ultrasonic sensor
into real-time OSC messages for PlugData/Pure Data synthesis control.
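To illustrate the pipeline above, here is a minimal sketch of packaging the two input streams as OSC (address, value) pairs. The addresses (`/gesture`, `/distance`), the port, and the function name are assumptions for illustration, not necessarily what the project uses; in the real pipeline python-osc's `SimpleUDPClient.send_message` would transmit each pair.

```python
# Sketch: package gesture IDs and sensor distance as OSC (address, value)
# pairs before sending with python-osc. Addresses and scaling are assumed.

def build_osc_messages(gesture_id, distance_cm):
    """Return the OSC (address, value) pairs for one frame of input."""
    return [
        ("/gesture", int(gesture_id)),      # hand-gesture class, 0-4
        ("/distance", float(distance_cm)),  # ultrasonic reading in cm
    ]

# With python-osc, each pair would be sent roughly like this:
#   from pythonosc.udp_client import SimpleUDPClient
#   client = SimpleUDPClient("127.0.0.1", 9000)  # PlugData port (assumed)
#   for address, value in build_osc_messages(2, 35.0):
#       client.send_message(address, value)
```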
- 0 = Open hand
- 1 = Closed fist
- 2 = Thumb only
- 3 = Thumb + index
- 4 = Thumb + index + middle
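The gesture codes above can be sketched as a small classifier over per-finger "extended" flags. This is a hypothetical sketch: in the actual code these booleans would be derived from MediaPipe's 21 hand landmarks, and the function name is illustrative.

```python
# Hypothetical sketch: map per-finger "extended" flags to the gesture IDs
# listed above. In the real project these flags would come from
# MediaPipe hand landmarks, not be passed in directly.

def classify_gesture(thumb, index, middle, ring, pinky):
    """Return the gesture ID (0-4) for a set of finger states, or None."""
    fingers = (thumb, index, middle, ring, pinky)
    if all(fingers):
        return 0                      # open hand
    if not any(fingers):
        return 1                      # closed fist
    mapping = {
        (True, False, False, False, False): 2,  # thumb only
        (True, True, False, False, False): 3,   # thumb + index
        (True, True, True, False, False): 4,    # thumb + index + middle
    }
    return mapping.get(fingers)       # None for unrecognized poses
```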
- Download `hand_landmarker.task` from the link in `hand_landmarker_download`
```
pip install -r requirements.txt
python main.py
```
- MediaPipe: Hand gesture detection library by Google
- python-osc: OSC protocol implementation
- PySerial: Arduino communication
- OpenCV: Computer vision processing
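Since PySerial delivers raw text lines from the Arduino, a small parser typically sits between the serial port and the OSC layer. The `distance:<value>` line format below is an assumption about the Arduino sketch's output, not something stated in this README.

```python
# Hypothetical sketch: parse one line of Arduino serial output into a
# distance in centimetres. The "distance:<value>" line format is assumed.

def parse_distance(line):
    """Return the distance in cm from a serial line, or None if malformed."""
    line = line.strip()
    if not line.startswith("distance:"):
        return None
    try:
        return float(line.split(":", 1)[1])
    except ValueError:
        return None

# With PySerial, lines would be read roughly like this:
#   import serial
#   port = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)  # port name assumed
#   cm = parse_distance(port.readline().decode("ascii", errors="ignore"))
```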
- YouTube tutorial for synthesizer:
- Generative AI (including GitHub Copilot) was used only for refining code, comments, and documentation, NOT for core development.
- Add machine learning model for custom gesture recognition training
- Expand to multi-hand tracking for collaborative musical control
- Optimize latency for real-time performance applications
- Add calibration UI for sensor distance thresholds
- Support multiple Arduino sensors for spatial audio control
- Support multiple musical scales instead of only the chromatic scale, for more musically pleasing results
Check `/doc` for the detailed project write-up.