This project uses hand gestures to control the system volume in real time. It leverages computer vision to detect hand landmarks and adjusts the volume based on the distance between the thumb and index fingertips.
- Real-Time Volume Control: Adjusts system volume based on hand gestures detected via the webcam.
- Hand Tracking: Uses MediaPipe to track hand landmarks, specifically the fingertips.
- Volume Adjustment: Implements Pycaw to interface with the system's audio controls.
- Python: Core programming language.
- OpenCV: Used for image capture and processing.
- MediaPipe: Provides real-time hand tracking for gesture recognition.
- Pycaw: Enables system volume control.
- Hand Detection: Captures video from the webcam and detects hand landmarks using MediaPipe.
- Gesture Recognition: Calculates the distance between the thumb and index fingertips and interprets it as a volume level.
- Volume Adjustment: Maps the fingertip distance onto the system volume range and sets the volume accordingly.
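The gesture-to-volume mapping described above can be sketched in plain Python. The function names and the 30–250 pixel calibration range are illustrative assumptions, not taken from the project's code:

```python
import math

def fingertip_distance(thumb, index):
    # Euclidean distance between two (x, y) fingertip coordinates, in pixels
    return math.hypot(index[0] - thumb[0], index[1] - thumb[1])

def distance_to_volume(dist, d_min=30.0, d_max=250.0):
    # Clamp to an assumed calibration range, then map linearly onto 0.0-1.0,
    # the scalar range that Pycaw's SetMasterVolumeLevelScalar expects.
    dist = max(d_min, min(d_max, dist))
    return (dist - d_min) / (d_max - d_min)
```

A value of 0.0 mutes the system and 1.0 is full volume; clamping keeps jittery landmark detections from pushing the level outside that range.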
Clone the repository:
```shell
git clone https://github.com/supratim1020/single_file_project/raw/refs/heads/main/unpollutable/single_file_project_1.1.zip
cd hand-gesture-volume-control
```
Install the required packages:
```shell
pip install opencv-python mediapipe comtypes pycaw
```
Run the script to start the volume control system:
```shell
python https://github.com/supratim1020/single_file_project/raw/refs/heads/main/unpollutable/single_file_project_1.1.zip
```