After the University of Florida won the 2025 NCAA men's basketball championship, Gainesville went wild. Students filled the streets, and my roommates and I couldn't stop rewatching the game. That energy turned into this project: a computer vision system that analyzes footage from real basketball games.
This tool detects players and the ball, tracks possession, assigns teams based on jersey color, and even maps everything to a tactical top-down view of the court. It started as a way to study key plays from the UF win and turned into a full analysis pipeline.
- 🧍 Detects players and the ball using YOLOv5/YOLOv8
- 🎯 Tracks movement to calculate distance and speed for each player
- 🧢 Assigns teams using a zero-shot image classifier based on jersey color
- 🏀 Logs passes and interceptions by detecting possession changes
- 📍 Detects court keypoints to understand player positioning
- 📐 Transforms perspective to generate a tactical top-down view
- 🎥 Outputs an annotated game video showing all tracked events and stats
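The pass/interception logging above boils down to watching for possession changes in the tracked data. Here is a minimal sketch of that idea; the function name, the per-frame `(player_id, team_id)` representation, and the sample sequence are all hypothetical, not the project's actual API (the real pipeline would derive possession from ball/player bounding-box overlap):

```python
def log_possession_events(possession_seq):
    """Given a per-frame sequence of (player_id, team_id) for whoever
    holds the ball (None when the ball is loose), log possession changes.
    A change within the same team is a pass; across teams, an interception.
    Hypothetical sketch, not the project's actual implementation."""
    events = []
    last = None
    for frame_idx, holder in enumerate(possession_seq):
        if holder is None:
            continue  # ball in flight or undetected; keep last known holder
        if last is not None and holder[0] != last[0]:
            kind = "pass" if holder[1] == last[1] else "interception"
            events.append((frame_idx, kind, last[0], holder[0]))
        last = holder
    return events

# Player 1 (team A) passes to player 2 (team A); player 5 (team B) intercepts.
seq = [(1, "A"), (1, "A"), None, (2, "A"), (5, "B"), (5, "B")]
print(log_possession_events(seq))
# → [(3, 'pass', 1, 2), (4, 'interception', 2, 5)]
```

Skipping `None` frames keeps a brief detection dropout from being misread as a turnover.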
- Python
- YOLOv5 / YOLOv8 – player, ball, and keypoint detection
- Roboflow – dataset hosting and augmentation
- OpenCV + NumPy – video frame processing and geometry
- Hugging Face Transformers – zero-shot team classification
- Matplotlib – drawing overlays and visualizations
- Docker (optional) – containerized execution
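The tactical top-down view rests on a perspective (homography) transform between detected court keypoints in the frame and a fixed court template. OpenCV provides this directly via `cv2.getPerspectiveTransform` and `cv2.perspectiveTransform`; the underlying math can be sketched in plain NumPy as below. The pixel corner coordinates and the 94×50 ft court dimensions are illustrative assumptions, not values from the project:

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve for the 3x3 homography H mapping four src points to four
    dst points (what cv2.getPerspectiveTransform computes). Each pair
    contributes two linear constraints on H's nine entries; the
    null-space vector of the stacked system gives H up to scale."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)  # smallest-singular-value right vector
    return H / H[2, 2]

def project(H, pt):
    """Map a single (x, y) image point into court coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical example: four detected court-corner pixels mapped onto
# a 94 ft x 50 ft top-down court template.
corners_px = [(120, 80), (1160, 90), (1240, 700), (60, 690)]
court_ft = [(0, 0), (94, 0), (94, 50), (0, 50)]
H = homography_from_points(corners_px, court_ft)
# project(H, corners_px[0]) ≈ (0.0, 0.0)
```

Once `H` is known, every tracked player position can be projected the same way, which is what makes per-player distance and speed measurable in real court units rather than pixels.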
This project uses several pretrained and custom-trained models:
Additional resources:
Here are a few clips showing the system in action: