feat: add GestureMapper abstraction and gesture implementations#3

Open
iliasmahboub wants to merge 2 commits into m2b3:main from iliasmahboub:feature/gesture-mapper

Conversation


@iliasmahboub iliasmahboub commented Mar 29, 2026

Summary

Decouples gesture detection from the consumer process by introducing a GestureMapper ABC that parallels the existing VideoInput pattern. Enables running multiple gestures simultaneously — something you can't do with the hardcoded tap logic.

New files

Gesture framework:

  • gestures/gesture_mapper.py — ABC with configure(), process(), reset()
  • gestures/tap_mapper.py — Extracts tap detection from latency_mp.py (same state machine, same landmarks, same threshold)
  • gestures/pinch_mapper.py — Thumb-index distance → continuous /pinch/distance + binary trigger/release
  • gestures/velocity_mapper.py — Hand speed → continuous /velocity/speed + /velocity/swipe with hysteresis
  • gestures/composite_mapper.py — Runs multiple mappers on the same hand data, merges OSC output
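
To make the shape of the abstraction concrete, here is a hedged sketch of what the GestureMapper ABC and a minimal PinchMapper could look like. The method signatures, landmark indexing, threshold value, and OSC address names are assumptions for illustration, not the repo's actual code:

```python
# Hypothetical sketch of the GestureMapper ABC and a minimal PinchMapper.
# Landmark format, threshold, and message shapes are assumptions.
import math
from abc import ABC, abstractmethod

class GestureMapper(ABC):
    """Maps per-frame hand landmarks to (osc_address, value) messages."""

    @abstractmethod
    def configure(self, **params):
        """Set thresholds and other runtime parameters."""

    @abstractmethod
    def process(self, landmarks):
        """Consume one frame of landmarks; return a list of OSC messages."""

    @abstractmethod
    def reset(self):
        """Clear internal state (e.g. between capture sessions)."""

class PinchMapper(GestureMapper):
    # MediaPipe hand indices: 4 = thumb tip, 8 = index fingertip.
    THUMB_TIP, INDEX_TIP = 4, 8

    def __init__(self):
        self.threshold = 0.05   # normalized distance; placeholder value
        self.pinched = False

    def configure(self, threshold=0.05):
        self.threshold = threshold

    def process(self, landmarks):
        dist = math.dist(landmarks[self.THUMB_TIP], landmarks[self.INDEX_TIP])
        msgs = [("/pinch/distance", dist)]          # continuous stream
        if dist < self.threshold and not self.pinched:
            self.pinched = True
            msgs.append(("/pinch/trigger", 1))      # edge-triggered press
        elif dist >= self.threshold and self.pinched:
            self.pinched = False
            msgs.append(("/pinch/release", 1))
        return msgs

    def reset(self):
        self.pinched = False
```

The split between a continuous value and edge-triggered press/release events mirrors the description above: the distance stream drives continuous parameters while the trigger/release pair drives envelopes.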

Pipeline integration:

  • latency_measurement/gesture_consumer.py — Drop-in consumer replacement that accepts any GestureMapper
  • preview_gestures.py — Webcam preview for testing without FLIR: python preview_gestures.py --gesture pinch
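
The consumer side can be sketched as a loop that is generic over the mapper. This is an illustrative stand-in, not the actual gesture_consumer.py: a plain queue.Queue and a send() callable substitute for the MediaPipe worker pool and the OSC client.

```python
# Hypothetical sketch of a consumer loop that accepts any GestureMapper.
# queue.Queue and send() stand in for the worker pool and OSC transport.
import queue

def run_consumer(mapper, frames, send, stop=None):
    """Drain landmark frames and forward the mapper's OSC messages."""
    mapper.reset()
    while True:
        try:
            landmarks = frames.get(timeout=0.1)
        except queue.Empty:
            if stop is not None and stop():
                break               # external shutdown requested
            continue
        if landmarks is None:       # sentinel: end of stream
            break
        for address, value in mapper.process(landmarks):
            send(address, value)
```

Because the loop only touches the three ABC methods, swapping gestures (or a composite of several) requires no changes to the consumer itself.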

PureData patches:

  • puredata/pinch_synth.pd — Pinch distance → oscillator frequency (200–1000 Hz), trigger/release → volume envelope
  • puredata/velocity_noise.pd — Hand speed → lowpass filter cutoff (100–5000 Hz) on noise, swipe → visual bang
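
The swipe bang relies on the hysteresis mentioned above so that speed jitter near the threshold cannot re-fire the trigger. A minimal sketch of that gate (the threshold values here are placeholders, not the repo's):

```python
# Minimal sketch of a hysteresis gate like the one behind /velocity/swipe.
# `low` and `high` are placeholder values, not the repo's actual thresholds.
class HysteresisTrigger:
    """Fire once when speed crosses `high`; rearm only below `low`."""

    def __init__(self, low=0.2, high=0.8):
        assert low < high
        self.low, self.high = low, high
        self.armed = True

    def update(self, speed):
        if self.armed and speed >= self.high:
            self.armed = False
            return True             # emit one swipe bang
        if not self.armed and speed <= self.low:
            self.armed = True       # jitter near `high` cannot re-fire
        return False
```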

Tests (19 total, first in this repo):

  • tests/test_gesture_mappers.py — 16 unit tests for all 4 mappers including CompositeMapper
  • tests/test_tap_equivalence.py — Replays hand sequences through both TapMapper and the original latency_mp.py consumer logic, asserts identical trigger behavior
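
The replay pattern behind the equivalence test can be sketched like this. Both detectors below are simplified stand-ins (stateless threshold callables), not the actual TapMapper or latency_mp.py logic; the point is comparing trigger frames from one shared sequence:

```python
# Sketch of the replay pattern: feed one recorded landmark sequence
# through two implementations and compare the frames at which they fire.
def trigger_frames(detect, frames):
    """Indices at which a detector callable fires on a frame sequence."""
    return [i for i, frame in enumerate(frames) if detect(frame)]

def assert_equivalent(old_detect, new_detect, frames):
    old = trigger_frames(old_detect, frames)
    new = trigger_frames(new_detect, frames)
    assert old == new, f"divergence: old={old} new={new}"
```

Asserting on the full list of trigger indices (rather than just the count) catches both missed and spurious triggers, which is what "identical trigger behavior" requires.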

12 new files, 0 existing files modified.

How this connects to the README's future work

The README proposes separating MediaPipe workers from OSC signal production: "A separate process that takes landmarks from the worker pool, maps them to gestures, and sends OSC signals." The GestureMapper interface is the mapping layer that architecture needs — gesture_consumer.py is the consumer side, and the mapper implementations handle the landmark→gesture→OSC translation.
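
The composition payoff can be sketched briefly. This is a hypothetical rendering of CompositeMapper, assuming it simply fans one landmark frame out to its child mappers and concatenates their messages in order:

```python
# Hypothetical sketch of CompositeMapper: fan one landmark frame out to
# several mappers and merge their OSC messages in registration order.
class CompositeMapper:
    def __init__(self, *mappers):
        self.mappers = list(mappers)

    def configure(self, **params):
        for m in self.mappers:
            m.configure(**params)

    def process(self, landmarks):
        merged = []
        for m in self.mappers:      # each mapper keeps its own state
            merged.extend(m.process(landmarks))
        return merged

    def reset(self):
        for m in self.mappers:
            m.reset()
```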

Test plan

  • python -m pytest tests/ -v — 19/19 passing in <0.3s
  • test_tap_equivalence.py proves TapMapper == original latency_mp.py logic
  • python preview_gestures.py --gesture tap --camera 0
  • python preview_gestures.py --gesture pinch --camera 0
  • python preview_gestures.py --gesture velocity --camera 0
  • Open puredata/pinch_synth.pd in PD, run preview with --gesture pinch
  • Open puredata/velocity_noise.pd in PD, run preview with --gesture velocity

…pers

Decouples gesture detection from the consumer process by introducing a
GestureMapper ABC (configure/process/reset) that parallels the existing
VideoInput pattern. Extracts the hardcoded tap logic into TapMapper and
adds PinchMapper (thumb-index distance) and VelocityMapper (hand speed
tracking) as new gesture types.

Includes a generic gesture_consumer drop-in for latency_mp.py, a webcam
preview script for FLIR-free testing, and 14 unit tests (first in this
repo) covering all three mappers.
CompositeMapper runs multiple gestures simultaneously on the same hand
data — the main payoff of the abstraction (not possible with the
hardcoded approach). Two PD patches demonstrate pinch->synth and
velocity->noise
filter. Equivalence test replays hand sequences through both TapMapper
and the original latency_mp.py consumer logic to prove identical output.

19 tests total, all passing.