Feat/py cv dashboard #6

Merged
Sa1koro merged 18 commits into main from feat/py-cv-dashboard on Mar 23, 2026
Conversation

@Sa1koro (Member) commented Mar 23, 2026

This pull request introduces major improvements to the Kait v2 Python host and ESP32 firmware ecosystem, focusing on emotion-driven control, robust documentation, and CI automation. The main highlights are the integration of emotion perception and reactor pipelines into the Python backend, detailed quickstart and hardware documentation for Kait nodes, and the addition of a Python CI workflow. These updates streamline setup, testing, and usage for both developers and end users.

Backend and Emotion Pipeline Integration:

  • Merged the python_host_emo emotion capability into python_host, introducing a perception + reactor pipeline for mapping human emotion (via camera/ViT) to flower states and device commands. Added robust API endpoints, multi-device emotion routing, and a real-time UI status block for live emotion state.
  • Added ViT-based emotion detection, an EmotionReactor with configurable thresholds/gains, and deterministic mapping to Kait, Sue, and Sylvie node commands. Multi-target dispatch and UI control for emotion scheduling are now supported.
  • Updated tests and dependencies to cover new APIs and ML features.
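To make the reactor's behavior concrete, here is a minimal sketch of what an EmotionReactor with configurable thresholds/gains and smoothing could look like. The class internals, field names, and numeric defaults are illustrative assumptions, not the PR's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class EmotionReactor:
    """Illustrative sketch of the reactor described above (internals hypothetical)."""
    threshold: float = 0.35   # minimum smoothed score before a command fires
    gain: float = 1.0         # scales the emotion score into command intensity
    smoothing: float = 0.8    # EMA factor: higher = slower, steadier reaction
    _state: dict = field(default_factory=dict)

    def update(self, scores: dict) -> list:
        """Smooth per-emotion scores and emit (address, value) OSC-style commands."""
        commands = []
        for emotion, score in scores.items():
            prev = self._state.get(emotion, 0.0)
            smoothed = self.smoothing * prev + (1 - self.smoothing) * score
            self._state[emotion] = smoothed
            if smoothed >= self.threshold:
                value = min(1.0, smoothed * self.gain)
                commands.append((f"/flower/{emotion}", round(value, 3)))
        return commands
```

The exponential smoothing means a single noisy frame cannot trigger a flower-state change; the score must persist across frames before a command is emitted.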

Documentation and Quickstart:

  • Added KAIT_V2_QUICKSTART.md: a comprehensive, user-friendly quickstart guide in Chinese, covering installation, WiFi/firmware setup, hardware wiring, command reference, troubleshooting, and creative usage scenarios for Kait v2.
  • Added/updated esp32_firmware/esp32_kait/KAIT_V2_GUIDE.md and esp32_firmware/esp32_kait/QUICK_REFERENCE.md for in-depth hardware, protocol, and command documentation, including pinouts, OSC/serial usage, and custom sequence extension.

Continuous Integration and Code Quality:

  • Introduced .github/workflows/python-ci.yml to automate Python tests (core and ML), linting, and multi-version compatibility checks for python_host.
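The workflow file itself is not reproduced in this PR description; a plausible minimal shape, consistent with the commits below (least-privilege `permissions`, multi-version matrix), might look like this. Paths, tool choices, and Python versions are assumptions:

```yaml
# Hypothetical sketch of .github/workflows/python-ci.yml (actual contents not shown here)
name: python-ci
permissions:
  contents: read            # matches the least-privilege fix noted in the commits
on:
  push:
    paths: ["python_host/**"]
  pull_request:
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.10", "3.11", "3.12"]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - run: pip install -r python_host/requirements.txt   # path assumed
      - run: pytest python_host
```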

Firmware and Hardware Reference:

  • Added a minimal example in esp32_kait.ino demonstrating PWM motor control with kick-start logic for N20 motors.
  • Clarified hardware pinout for Kait and Sylvie nodes, and updated terminology for consistency.
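The kick-start idea is that small N20 gear motors stall below a certain PWM duty, so the firmware briefly applies a high duty to overcome static friction before settling at the target speed. The actual example lives in esp32_kait.ino (C++); the logic can be sketched language-agnostically as a plan of (duty, hold_ms) steps. All numeric values here are illustrative, not taken from the firmware:

```python
def kickstart_plan(target_duty: int, kick_duty: int = 255, kick_ms: int = 60,
                   min_run_duty: int = 40) -> list:
    """Return a list of (duty, hold_ms) steps for starting a small gear motor.

    N20-class motors often stall at low duty, so a brief full-power "kick"
    overcomes static friction before the motor settles at the target speed.
    Thresholds and timings are hypothetical examples.
    """
    target = max(0, min(255, target_duty))
    if target == 0:
        return [(0, 0)]                      # stop immediately, no kick needed
    if target >= kick_duty:
        return [(target, 0)]                 # already at or above kick level
    run = max(target, min_run_duty)          # clamp into the motor's usable band
    return [(kick_duty, kick_ms), (run, 0)]  # kick, then hold the run duty
```

On the ESP32 side the same sequence would be played back with `ledcWrite` calls separated by short delays.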

Most Important Changes:

Emotion pipeline and backend integration

  • Merged emotion perception and reactor logic into python_host, supporting real-time emotion-to-command mapping, multi-device routing, and a live UI status block.
  • Added ViT-based emotion detection, a configurable EmotionReactor, and deterministic mapping to Kait/Sue/Sylvie node commands.
  • Updated and extended tests and ML dependencies to support new features.

Documentation and onboarding

  • Added KAIT_V2_QUICKSTART.md for step-by-step setup, command reference, and troubleshooting in Chinese, targeting new users and demo scenarios.
  • Expanded and clarified hardware and protocol documentation in KAIT_V2_GUIDE.md and QUICK_REFERENCE.md, including detailed pinouts, OSC/serial commands, and custom sequence instructions.

Developer experience and CI

  • Introduced a Python CI workflow to automate testing and linting for python_host, ensuring code quality and compatibility across Python versions.

Firmware and hardware reference

  • Added a simple PWM motor control example for ESP32 Kait node firmware, demonstrating kick-start logic for reliable low-speed operation.
  • Updated hardware guides for Kait and Sylvie nodes to clarify pin assignments and terminology.

Copilot AI and others added 18 commits March 4, 2026 12:30
…rol panel, vision tracking, ML perception, CI/CD

- Step 1: Upgrade sylvie_main.ino from digitalWrite to ledcAttach/ledcWrite PWM (0-255)
- Step 2: Extend routeMotor1/routeMotor2 to accept dir + speed OSC parameters
- Step 3: Add test_osc_motor.py minimal test script
- Step 4: Weighted multi-face tracking with multi-camera support
- Step 5: Flask control panel with video, sliders, 2D XY pad, Override, Tag & Save
- Step 6: MediaPipe/DeepFace perception module with lazy loading
- Step 7: GitHub Actions CI/CD workflow for Python tests

Co-authored-by: Sa1koro <13943286+Sa1koro@users.noreply.github.com>
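For the extended `/motor1`-style OSC routes (direction + speed), a minimal test script only needs to encode an OSC message with two int32 arguments. Libraries like python-osc do this for you; the stdlib-only sketch below shows the wire format that something like test_osc_motor.py could send over UDP. The address and argument layout follow the routeMotor1 change above; the port in the usage note is an assumption:

```python
import struct

def osc_message(address: str, *args: int) -> bytes:
    """Build a raw OSC packet with int32 arguments, e.g. /motor1 <dir> <speed>."""
    def pad(b: bytes) -> bytes:
        # OSC strings are NUL-terminated and padded to a 4-byte boundary
        b += b"\x00"
        return b + b"\x00" * (-len(b) % 4)

    packet = pad(address.encode("ascii"))
    packet += pad(b"," + b"i" * len(args))     # type tag string, e.g. ",ii"
    for value in args:
        packet += struct.pack(">i", value)     # big-endian int32
    return packet
```

Usage would be something like `sock.sendto(osc_message("/motor1", 1, 180), (esp32_ip, 8000))`, with the listening port depending on the firmware configuration.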
- Replace deprecated substr() with substring() in HTML template
- Replace alert() with inline toast notification for better UX
- Remove duplicate cv2 imports in perception.py
- Store target info separately instead of accessing private _address/_port
- Use proper FaceTracker re-instantiation instead of __init__ call
- Add permissions: contents: read to CI workflow

Co-authored-by: Sa1koro <13943286+Sa1koro@users.noreply.github.com>
…iguration

- Move ESP32 firmware files to new refactored directory structure
- Update Flask server port from 5000 to 15000 in main.py
- Rename Chinese component label from "舵机" to English "Servo" in hardware guide
- Add Kait node section to hardware documentation
- Create pyproject.toml with project metadata and pyserial dependency
- Generate uv.lock file with package dependencies
- Maintain all Python test scripts and control utilities in new structure
- Implement mDNS and gateway-based device discovery mechanisms
- Add device registry with node type inference and labeling
- Create device selection and management API endpoints
- Integrate threading support for concurrent device operations
- Add raw OSC console with history logging capabilities
- Implement dynamic UI controls based on detected node types
- Replace static target configuration with dynamic device management
- Add UTF-8 encoding for training sample file operations
- Create device registry JSON configuration file
- Update frontend UI with device scanning and selection controls
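The registry with node-type inference and labeling could be sketched as below. The real inference rules are not shown in this PR, so this simply matches known node names in a discovered hostname; the entry schema is a hypothetical shape:

```python
def infer_node_type(hostname: str) -> str:
    """Heuristic node-type guess from a discovered hostname (illustrative only)."""
    name = hostname.lower()
    for node_type in ("kait", "sue", "sylvie"):
        if node_type in name:
            return node_type
    return "unknown"

class DeviceRegistry:
    """Minimal registry keyed by (ip, port), mirroring the API described above."""
    def __init__(self):
        self._devices = {}

    def register(self, ip: str, port: int, hostname: str, label: str = "") -> dict:
        entry = {
            "ip": ip, "port": port, "hostname": hostname,
            "type": infer_node_type(hostname),   # drives which UI controls render
            "label": label or hostname,
        }
        self._devices[(ip, port)] = entry
        return entry

    def list(self) -> list:
        return list(self._devices.values())
```

Keying by (ip, port) means a re-scan naturally updates an existing entry instead of duplicating it.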
- Introduced threading-based camera lifecycle state management.
- Added API endpoints for camera control: start, stop, switch, and state.
- Implemented camera index handling and switching logic.
- Updated video feed and face detection routes to check camera state.
- Enhanced `/api/faces` endpoint to include camera running state.
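The threading-based camera lifecycle can be sketched as a small state manager. A real implementation would open `cv2.VideoCapture(index)`; here the capture factory is injectable so the state logic is testable without hardware, and method names only loosely mirror the start/stop/switch/state endpoints:

```python
import threading

class CameraManager:
    """Sketch of the camera lifecycle state management described above."""

    def __init__(self, open_capture=None):
        # RLock because start/stop/switch call state() while already holding the lock
        self._lock = threading.RLock()
        self._open = open_capture or (lambda index: object())  # stand-in capture
        self._cap = None
        self._index = 0

    def state(self) -> dict:
        with self._lock:
            return {"running": self._cap is not None, "index": self._index}

    def start(self, index: int = 0) -> dict:
        with self._lock:
            if self._cap is None:          # idempotent: ignore double-start
                self._index = index
                self._cap = self._open(index)
            return self.state()

    def stop(self) -> dict:
        with self._lock:
            if self._cap is not None:
                release = getattr(self._cap, "release", None)  # cv2 captures have this
                if release:
                    release()
                self._cap = None
            return self.state()

    def switch(self, index: int) -> dict:
        with self._lock:                   # stop + restart atomically
            self.stop()
            return self.start(index)
```

The video feed and face detection routes would then consult `state()["running"]` before grabbing frames.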
- Introduced complete project files for Kait Node v2 including firmware, Python debug tools, and comprehensive documentation.
- Added `requirements-kait.txt` listing project dependencies.
- Created KAIT_INDEX.md as a centralized reference for all files and usage instructions.
- Included KAIT_V2_DELIVERY_REPORT.md as a detailed summary of project deliverables, features, and improvements.
- Developed quick reference and troubleshooting guide in QUICK_REFERENCE.md.
- Implemented `kait_motion_visualization.py` for motion mode visualization and parameter exploration.
- Added Python scripts for OSC debugging, serial debugging, and motion visualization tools.
- Created `API_REFERENCE.md`, detailing OSC commands, motion modes, sequences, and firmware APIs.
- Added `DELIVERY_CHECKLIST.md`, providing a thorough package validation and content summary.
- Completed English translation for all documentation and scripts.
- Updated `kait_osc_debug_en.py` with sequence functions and an interactive mode.
- Implemented motion sequence recorder with label-based organization.
- Added `/api/sequences` endpoints for saving, loading, and listing sequences.
- Extended Flask app tests to cover sequence API.
- Updated frontend with sequence recording, playback features, and status tracking.
- Enhanced UI accessibility with `aria-label` attributes.
- Adjusted camera selection route to use specific backend for macOS compatibility.
- Refined Kait node UI with motion modes and motor speed controls.
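The label-organized sequence recorder behind `/api/sequences` could be sketched as follows; the actual storage format is not shown in this PR, so the step layout (elapsed seconds, OSC address, args) is an assumed shape:

```python
import json
import time

class SequenceRecorder:
    """Sketch of the label-based motion sequence recorder described above."""

    def __init__(self):
        self._sequences = {}   # label -> list of [elapsed_s, address, args]
        self._current = None
        self._t0 = 0.0

    def start(self, label: str):
        self._current = label
        self._t0 = time.monotonic()
        self._sequences[label] = []

    def record(self, address: str, args: list):
        if self._current is None:
            raise RuntimeError("call start(label) before recording")
        elapsed = time.monotonic() - self._t0
        self._sequences[self._current].append([round(elapsed, 3), address, args])

    def stop(self):
        self._current = None

    def labels(self) -> list:
        return sorted(self._sequences)

    def dumps(self) -> str:
        """Serialize all sequences, e.g. for a save endpoint or file."""
        return json.dumps(self._sequences)
```

Playback would simply replay each step's address/args after sleeping for the recorded offsets.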
…munication support

- Implemented `CoordinatePublisher` for face-tracking data via OSC or USB serial.
- Added `SerialCoordinateSender` with pyserial integration and serial port handling.
- Introduced `/api/tracking/config` and `/api/serial/ports` endpoints for coordinate tracking and serial communication.
- Updated UI with tracking transport switch and serial connection status.
- Added Arduino-based ESP32 firmware for servo-based face tracking via OSC or serial.
- Introduced `/api/discovery/*` compatibility routes for mDNS, gateway, and auto discovery modes.
- Refactored device scan logic into a reusable `_scan_and_register_devices` function.
- Added UI support for manual device addition via IP and port.
- Implemented fallback CSS for offline usage when Tailwind CDN is unavailable.
- Updated tests to cover new discovery APIs and scenarios.
…UI controls

- Implemented lazy loading for pyserial modules to optimize app startup time.
- Added `/api/serial/raw` endpoint for sending raw serial commands and handling errors.
- Updated `/api/serial/ports` endpoint to support optional scanning with `scan` query parameter.
- Enhanced UI with serial debug command field and improved manual motor and drive pad controls.
- Updated tests to cover new API endpoints and serial behaviors.
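Lazy loading a module to speed up startup can be done with a small proxy that defers the import until first attribute access. This is a generic sketch of the technique, not the PR's code; it is demonstrated here with a stdlib module since pyserial may not be installed:

```python
import importlib

class LazyModule:
    """Defer an import until the first attribute access."""

    def __init__(self, name: str):
        self._name = name
        self._module = None

    def __getattr__(self, attr):
        # Only called for attributes not found normally, i.e. real module members
        if self._module is None:
            self._module = importlib.import_module(self._name)
        return getattr(self._module, attr)

# The app can declare this at import time without paying the import cost:
serial = LazyModule("serial")   # pyserial loads only when serial.Serial(...) is used
```

This keeps the Flask app importable (and its tests runnable) even when optional serial hardware support is absent, as long as no serial attribute is touched.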
…d Sylvie motor functions

- Simplified camera list population logic and improved error handling.
- Removed deprecated Sylvie motor and drive pad functionality.
- Introduced helper functions for refreshing devices and tracking configuration.
- Enhanced LED, OSC, and device control operations for clarity and consistency.
- Added fallback logic to improve device list handling during scans.
- Added deadband and minimum effective threshold logic for motor inputs to improve precision.
- Introduced crosshair and mono font styles to the drive pad for better usability and feedback.
- Implemented snapping and clamping for motor values to ensure consistent control behavior.
- Added new manual and stop button behaviors with improved state handling.
- Included preset 3 button for extended functionality.
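The deadband, minimum-effective-threshold, snapping, and clamping steps above compose into one input-shaping function. The thresholds below are illustrative values, not the ones shipped in the PR:

```python
def shape_motor_input(value: float, deadband: float = 0.08,
                      min_effective: float = 0.15, snap: float = 0.05) -> float:
    """Shape a raw pad/slider input in [-1, 1] into a usable motor command."""
    value = max(-1.0, min(1.0, value))              # clamp out-of-range input
    if abs(value) < deadband:
        return 0.0                                  # ignore stick/touch jitter
    sign = 1.0 if value > 0 else -1.0
    magnitude = max(abs(value), min_effective)      # lift above the stall region
    magnitude = round(magnitude / snap) * snap      # snap to a coarse grid
    return round(sign * min(1.0, magnitude), 2)
```

Snapping means tiny pad movements produce identical commands, so the host is not flooding the node with near-duplicate OSC messages.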
- Introduced new Sue node UI with motion pad, LED pad, and state control buttons.
- Added sliders for Sue's angle, speed, and LED color adjustments.
- Implemented real-time control logic with motion and LED pad interactivity.
- Enhanced modularity with reusable helper functions for Sue-specific controls.
- Improved responsiveness and feedback for Sue's motion and LED updates.
… tracking and perception

- Created Flask-based control panel with modular support for vision, network, and UI components.
- Implemented real-time face tracking, OSC-based motor/LED controls, and emotion analysis.
- Added `/api/faces` route to serve vision features, emotion data, and tracking overlays.
- Developed responsive UI with motor sliders, 2D XY pad, and override toggle.
- Introduced data tagging and saving workflow for emotion-based training samples.
- Integrated ViT-based emotion detector configuration and supporting API endpoints.
…UI updates

- Integrated ViT-based emotion detector with fallback support into perception pipeline.
- Added Emotion Reactor for smoothing, mapping, and live OSC command generation.
- Updated `/api/faces` to include perception and reactor data.
- Enhanced UI with real-time flower emotion status, analysis table, and reactor tuning sliders.
- Added `/api/reactor/config` endpoint for runtime adjustments.
- Extended tests for ViT emotion, reactor logic, and UI additions.
…ling modes

- Added support for dispatching reactor commands to multiple emotion targets.
- Introduced global emotion scheduling toggle for manual control suspension.
- Updated UI with mode selection buttons (Safe, Balanced, Dramatic) to adjust reactor behavior.
- Added per-device emotion routing toggle with backend integration via `/api/devices/emotion_targets`.
- Enhanced Flask app with new endpoints for emotion override states and multi-target routing.
- Improved tests to cover new functionalities and edge cases.
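The multi-target dispatch with a global scheduling toggle reduces to a fan-out over devices whose routing flag is set. The device dict shape and the `emotion_target` flag name are assumed for illustration, not the PR's actual schema:

```python
def dispatch_emotion_commands(commands: list, devices: list,
                              scheduling_enabled: bool = True) -> list:
    """Fan reactor commands out to every device with emotion routing enabled.

    Returns the (device_label, address, args) tuples that would be sent,
    so the caller (or a test) can inspect them before any network I/O.
    """
    if not scheduling_enabled:
        return []            # global toggle suspends emotion-driven control
    sent = []
    for device in devices:
        if not device.get("emotion_target", False):
            continue         # per-device routing toggle is off
        for address, args in commands:
            sent.append((device["label"], address, args))
    return sent
```

Keeping the dispatch pure like this makes the Safe/Balanced/Dramatic modes easy to test: each mode only changes the reactor parameters upstream, not the routing.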
@Sa1koro Sa1koro merged commit 26d75f4 into main Mar 23, 2026
4 of 5 checks passed