Illustration by Angela Torchio
p5.js on mobile offers unique opportunities and challenges. The core p5.js framework does an excellent job of making it easy to read data from a phone's various inputs and sensors; however, it doesn't deal with the realities of contemporary browsers' built-in gestures and security protocols. That's where this library comes in:
- Simplifies accessing phone hardware from the browser (accelerometers, gyroscopes, microphone, vibration motor)
- Simplifies disabling default phone gestures (zoom, refresh, back, etc.)
- Simplifies enabling audio output
- Simplifies using an on-screen console to display errors and debug info
This library simplifies access to the following p5.js mobile sensor and audio commands:
Touch/Pointer Events:
- `mousePressed()` - Called when a press/touch begins (works for both mouse and touch in p5.js 1.x and 2.0)
- `mouseReleased()` - Called when a press/touch ends (works for both mouse and touch in p5.js 1.x and 2.0)
- `touchStarted()` - Called when a touch begins (p5.js 1.x only)
- `touchEnded()` - Called when a touch ends (p5.js 1.x only)
Device Motion & Orientation:
- `rotationX` - Device tilt forward/backward
- `rotationY` - Device tilt left/right
- `rotationZ` - Device rotation around screen
- `accelerationX` - Acceleration left/right
- `accelerationY` - Acceleration up/down
- `accelerationZ` - Acceleration forward/back
- `deviceShaken()` - Shake detection event
- `deviceMoved()` - Movement detection event
- `setShakeThreshold()` - Set shake detection sensitivity
- `setMoveThreshold()` - Set movement detection sensitivity
Audio Input (requires p5.sound):
- `p5.AudioIn()` - Audio input object
- `getLevel()` - Current audio input level
- iOS 13+ (Safari)
- Android 7+ (Chrome)
- Chrome 80+
- Safari 13+
- Firefox 75+
p5-phone supports both p5.js 1.x and p5.js 2.0+.
| Feature | p5.js 1.x | p5.js 2.0+ |
|---|---|---|
| Permission UI (Tap/Button/Canvas/Banner/Custom) | ✅ | ✅ |
| Motion sensors (rotationX/Y/Z, accelerationX/Y/Z) | ✅ | ✅ |
| Microphone / Speech / Sound | ✅ | ✅ |
| Camera (PhoneCamera) | ✅ | ✅ |
| Vibration | ✅ | ✅ |
| NFC Tag Reading (Android only) | ✅ | ✅ |
| Debug console | ✅ | ✅ |
| lockGestures() | ✅ | ✅ |
| touchStarted() / touchEnded() | ✅ | ❌ Use mousePressed() / mouseReleased() |
| p5.registerAddon() | ❌ | ✅ (auto-detected) |
Key change in p5.js 2.0: Touch-specific callbacks (touchStarted, touchMoved, touchEnded) are no longer dispatched. The unified Pointer API routes all input (mouse + touch) through mousePressed(), mouseDragged(), and mouseReleased(). These mouse callbacks work in both p5.js 1.x and 2.0, so use them for forward-compatible code.
p5-phone automatically detects the p5.js version and adjusts its internal touch override behavior accordingly. No configuration needed.
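For reference, checking the major version yourself is a one-liner (illustrative only; p5-phone already performs this detection internally). The only assumption here is `p5.VERSION`, the version string p5.js exposes on its constructor:

```javascript
// Illustrative only: p5-phone does this detection for you.
// p5.VERSION is the version string p5.js exposes, e.g. '1.11.10' or '2.0.1'.
function isP5v2(versionString) {
  // Major version 2 or higher means the unified Pointer API is in effect
  return parseInt(versionString.split('.')[0], 10) >= 2;
}

// In a sketch you might branch like:
//   if (!isP5v2(p5.VERSION)) { /* touchStarted()/touchEnded() still fire */ }
```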
- Link for Interactive Examples
- Browser Compatibility
- p5.js Version Compatibility
- CDN (Recommended)
- Basic Setup
- API Reference
- Permission UI Styles
- Troubleshooting / FAQ
<!-- Minified version (recommended) -->
<script src="https://cdn.jsdelivr.net/npm/p5-phone@1.9.0/dist/p5-phone.min.js"></script>
<!-- Development version (larger, with comments) -->
<!-- <script src="https://cdn.jsdelivr.net/npm/p5-phone@1.9.0/dist/p5-phone.js"></script> -->

<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Mobile p5.js App</title>
<!-- Basic CSS to remove browser defaults and align canvas -->
<style>
body {
margin: 0;
padding: 0;
overflow: hidden;
}
</style>
<!-- Load p5.js library -->
<script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.11.10/p5.min.js"></script>
<!-- For p5.js 2.0: <script src="https://cdn.jsdelivr.net/npm/p5@2/lib/p5.min.js"></script> -->
<!-- Load p5-phone library -->
<script src="https://cdn.jsdelivr.net/npm/p5-phone@1.9.0/dist/p5-phone.min.js"></script>
</head>
<body>
<!-- Load the p5.js sketch -->
<script src="sketch.js"></script>
</body>
</html>

let mic;
let mySound;
function preload() {
// Load sound file if needed
// mySound = loadSound('assets/sound.mp3');
}
function setup() {
// Show debug panel FIRST to catch setup errors
showDebug();
createCanvas(windowWidth, windowHeight);
// Lock mobile gestures to prevent browser interference
lockGestures();
// Enable motion sensors with tap-to-start
enableGyroTap('Tap to enable motion sensors');
// Enable microphone with tap-to-start (also enables sound output)
mic = new p5.AudioIn();
enableMicTap('Tap to enable microphone');
// OR enable sound output only (no microphone input)
// enableSoundTap('Tap to enable sound');
}
function draw() {
background(220);
// Always check status before using hardware features
if (window.sensorsEnabled) {
// Use device rotation and acceleration
fill(255, 0, 0);
circle(width/2 + rotationY * 5, height/2 + rotationX * 5, 50);
}
if (window.micEnabled) {
// Use microphone input
let level = mic.getLevel();
fill(0, 255, 0);
rect(10, 10, level * 200, 20);
}
if (window.soundEnabled) {
// Safe to play sounds
// mySound.play();
}
}
// Prevent default touch behavior (optional but recommended)
// Use mousePressed/mouseReleased — works in both p5.js 1.x and 2.0
function mousePressed() {
return false;
}
function mouseReleased() {
return false;
}

// Essential mobile setup
lockGestures() // Prevent browser gestures (call in setup())
// Motion sensor activation
enableGyroTap(message) // Tap anywhere to enable sensors
enableGyroButton(text) // Button-based sensor activation
// Microphone activation
enableMicTap(message) // Tap anywhere to enable microphone
enableMicButton(text) // Button-based microphone activation
// Sound output activation (no microphone input)
enableSoundTap(message) // Tap anywhere to enable sound playback
enableSoundButton(text) // Button-based sound activation
// Speech recognition activation (Web Speech API)
enableSpeechTap(message) // Tap anywhere to enable speech
enableSpeechButton(text) // Button-based speech activation
// Combined activation (motion + microphone)
enableAllTap(message) // Tap anywhere to enable both
enableAllButton(text) // Button-based combined activation
// Vibration motor (Android only)
enableVibrationTap(message) // Tap anywhere to enable vibration
enableVibrationButton(text) // Button-based vibration activation
vibrate(pattern) // Trigger vibration (duration or pattern array)
stopVibration() // Stop any ongoing vibration
// Camera (ML5 integration)
createPhoneCamera(active, mirror, mode) // Create camera instance
enableCameraTap(message) // Tap to enable camera
enableCameraButton(text) // Button-based camera activation
// --- Alternative Permission UI Styles (v1.7.0) ---
// Canvas-first-touch — permissions fire on first canvas interaction
enableGyroCanvas(message) // Also: enableMicCanvas, enableSoundCanvas,
// enableSpeechCanvas, enableVibrationCanvas,
// enableAllCanvas, enableCameraCanvas
// Banner — slim notification bar at top or bottom of screen
enableGyroBanner(message, position) // position: 'top' or 'bottom'
// Also: enableMicBanner, enableSoundBanner,
// enableSpeechBanner, enableVibrationBanner,
// enableAllBanner, enableCameraBanner
// Custom element binding — attach to your own DOM element
enableGyroOn(selector) // e.g., enableGyroOn('#my-button')
// Also: enableMicOn, enableSoundOn, enableSpeechOn,
// enableVibrationOn, enableAllOn, enableCameraOn
// Status variables (check these in your code)
window.sensorsEnabled // Boolean: true when motion sensors are active
window.micEnabled // Boolean: true when microphone is active
window.soundEnabled // Boolean: true when sound output is active
window.speechEnabled // Boolean: true when speech recognition is active
window.vibrationEnabled // Boolean: true when vibration is available (Android only)
// Debug system
showDebug() // Show on-screen debug panel with automatic error catching
hideDebug() // Hide debug panel
toggleDebug() // Toggle panel visibility
debug(...args) // Console.log with on-screen display and timestamps
debugError(...args) // Display errors with red styling
debugWarn(...args) // Display warnings with yellow styling
debug.clear() // Clear debug messages

p5.js Namespace Support: All functions are also available as p5.prototype methods:
// You can use either syntax:
lockGestures(); // Global function (recommended)
this.lockGestures(); // p5.js instance method
// Both approaches work identically
enableGyroTap('Tap to start');
this.enableGyroTap('Tap to start');

Purpose: Check whether permissions have been granted and sensors are active.
Variables:
- `window.sensorsEnabled` - Boolean indicating if motion sensors are active
- `window.micEnabled` - Boolean indicating if microphone is active
- `window.soundEnabled` - Boolean indicating if sound output is active
- `window.speechEnabled` - Boolean indicating if speech recognition is active
- `window.vibrationEnabled` - Boolean indicating if vibration is available (Android only)
- `window.nfcEnabled` - Boolean indicating if NFC scanning is active (Android only)
Usage:
function draw() {
// Always check before using sensor data
if (window.sensorsEnabled) {
// Safe to use rotationX, rotationY, accelerationX, etc.
let tilt = rotationX;
}
if (window.micEnabled) {
// Safe to use microphone
let audioLevel = mic.getLevel();
}
if (window.soundEnabled) {
// Safe to play sounds
mySound.play();
}
if (window.vibrationEnabled) {
// Safe to use vibration (Android only)
vibrate(50);
}
if (window.nfcEnabled) {
// NFC scanning is active (Android only)
// Tag data arrives via nfcRead() callback
}
}
// You can also use them for conditional UI
function setup() {
enableGyroTap('Tap to enable motion');
// Show different instructions based on status
if (!window.sensorsEnabled) {
debug("Motion sensors not yet enabled");
}
}

Purpose: Prevents unwanted mobile browser gestures that can interfere with your p5.js app.
When to use: Call once in your setup() function after creating the canvas.
What it blocks:
- Pinch-to-zoom - Prevents users from accidentally zooming the page
- Pull-to-refresh - Stops the browser refresh gesture when pulling down
- Swipe navigation - Disables back/forward swipe gestures
- Long-press context menus - Prevents copy/paste menus from appearing
- Text selection - Stops accidental text highlighting on touch and hold
- Double-tap zoom - Eliminates double-tap to zoom behavior
function setup() {
createCanvas(windowWidth, windowHeight);
lockGestures(); // Essential for smooth mobile interaction
}

Purpose: Enable device motion and orientation sensors with user permission handling.
Commands:
- `enableGyroTap(message)` - Tap anywhere on screen to enable sensors
- `enableGyroButton(text)` - Creates a button with custom text to enable sensors
Usage:
// Tap-to-enable (recommended)
enableGyroTap('Tap to enable motion sensors');
// Button-based activation
enableGyroButton('Enable Motion');

Available p5.js Variables (when window.sensorsEnabled is true):
| Variable | Description | Range/Units |
|---|---|---|
| `rotationX` | Device tilt forward/backward | -180° to 180° |
| `rotationY` | Device tilt left/right | -180° to 180° |
| `rotationZ` | Device rotation around screen | -180° to 180° |
| `accelerationX` | Acceleration left/right | m/s² |
| `accelerationY` | Acceleration up/down | m/s² |
| `accelerationZ` | Acceleration forward/back | m/s² |
| `deviceShaken` | Shake detection event | true when shaken |
| `deviceMoved` | Movement detection event | true when moved |
Important: All motion sensor variables, including deviceShaken and deviceMoved, are only available when window.sensorsEnabled is true. Always check this status before using any motion data.
Example:
function draw() {
// CRITICAL: Always check window.sensorsEnabled first
if (window.sensorsEnabled) {
// Tilt-controlled circle
let x = width/2 + rotationY * 3;
let y = height/2 + rotationX * 3;
circle(x, y, 50);
// Shake detection - only works when sensors are enabled
if (deviceShaken) {
background(random(255), random(255), random(255));
}
// Movement detection - also requires sensors to be enabled
if (deviceMoved) {
fill(255, 0, 0);
}
} else {
// Show fallback when sensors not enabled
text('Tap to enable motion sensors', 20, 20);
}
}

Purpose: Enable device microphone with user permission handling for audio-reactive applications.
Important: Microphone examples require the p5.sound library. Add this script tag to your HTML:
<script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.11.0/addons/p5.sound.min.js"></script>

Commands:
- `enableMicTap(message)` - Tap anywhere on screen to enable microphone
- `enableMicButton(text)` - Creates a button with custom text to enable microphone
Usage:
// Tap-to-enable (recommended)
enableMicTap('Tap to enable microphone');
// Button-based activation
enableMicButton('Enable Audio');

Available p5.js Variables (when window.micEnabled is true):
| Variable | Description | Range |
|---|---|---|
| `p5.AudioIn()` | Audio input object (stored in `mic`) | Object |
| `mic.getLevel()` | Current audio input level | 0.0 to 1.0 |
Example:
let mic;
function setup() {
createCanvas(windowWidth, windowHeight);
// Create a new p5.AudioIn() instance
mic = new p5.AudioIn();
// Enable microphone with tap
enableMicTap();
}
function draw() {
if (window.micEnabled) {
// The mic object is a p5.AudioIn() instance
// Audio-reactive visualization
let level = mic.getLevel();
let size = map(level, 0, 1, 10, 200);
background(level * 255);
circle(width/2, height/2, size);
}
}

Purpose: Enable audio playback without requiring microphone input. Perfect for playing sounds, music, synthesizers, and audio effects in mobile browsers.
Important: Sound examples require the p5.sound library. Add this script tag to your HTML:
<script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.11.0/addons/p5.sound.min.js"></script>

Commands:
- `enableSoundTap(message)` - Tap anywhere on screen to enable sound playback
- `enableSoundButton(text)` - Creates a button with custom text to enable sound
Usage:
// Tap-to-enable (recommended)
enableSoundTap('Tap to enable sound');
// Button-based activation
enableSoundButton('Enable Sound');

When to use Sound vs. Microphone:
- Use `enableSound` for: Playing audio files, synthesizers, oscillators, sound effects
- Use `enableMic` for: Recording audio, audio-reactive visualizations, voice input
- Note: `enableMic` also enables sound output, so you don't need both
Example:
let mySound;
function preload() {
// Load audio file
mySound = loadSound('assets/sound.mp3');
}
function setup() {
createCanvas(windowWidth, windowHeight);
// Enable sound playback with tap
enableSoundTap('Tap to enable sound');
}
function draw() {
background(220);
if (window.soundEnabled) {
text('Tap anywhere to play sound', 20, 20);
} else {
text('Waiting for sound activation...', 20, 20);
}
}
function mousePressed() {
// Check if sound is enabled before playing
if (window.soundEnabled && !mySound.isPlaying()) {
mySound.play();
}
}

Purpose: Access the device's vibration motor for haptic feedback and tactile interactions.
- ✅ Android - Full support in Chrome and most Android browsers
- ❌ iOS - Not supported (Vibration API not available on iOS devices)
Important: The vibration feature will automatically detect if the device supports vibration. On iOS or unsupported devices, window.vibrationEnabled will be false and vibration calls will be safely ignored with console warnings.
Commands:
- `enableVibrationTap(message)` - Tap anywhere on screen to enable vibration
- `enableVibrationButton(text)` - Creates a button with custom text to enable vibration
- `vibrate(pattern)` - Trigger vibration with a duration (ms) or pattern array
- `stopVibration()` - Stop any ongoing vibration
Usage:
function setup() {
createCanvas(windowWidth, windowHeight);
// Enable vibration with tap (Android only)
enableVibrationTap('Tap to enable vibration');
// Or use a button
// enableVibrationButton('Enable Haptics');
}
function draw() {
background(220);
if (window.vibrationEnabled) {
text('Vibration ready! Tap anywhere', 20, 20);
} else {
text('Vibration not available', 20, 20);
}
}
function mousePressed() {
if (window.vibrationEnabled) {
// Simple vibration - 50ms pulse
vibrate(50);
}
}

Vibration Patterns:
// Single vibration (duration in milliseconds)
vibrate(100); // Vibrate for 100ms
// Pattern: [vibrate, pause, vibrate, pause, ...]
vibrate([100, 50, 100]); // Short-short pattern
vibrate([200, 100, 200, 100, 200]); // Triple pulse
vibrate([50, 50, 50, 50, 500]); // Quick taps then long
// Stop any ongoing vibration
stopVibration();

Common Use Cases:
// Haptic feedback for button presses
function mousePressed() {
if (window.vibrationEnabled) {
vibrate(20); // Quick tap feedback
}
}
// Touch zones with different haptic patterns
function mousePressed() {
if (window.vibrationEnabled) {
if (mouseX < width/2) {
vibrate(50); // Left side - short pulse
} else {
vibrate([50, 30, 50]); // Right side - double pulse
}
}
return false;
}
// Collision detection
function checkCollision() {
if (collision && window.vibrationEnabled) {
vibrate([100, 50, 100, 50, 200]); // Alert pattern
}
}
// Game events
function gameOver() {
if (window.vibrationEnabled) {
vibrate(500); // Long vibration for game over
}
}

Best Practices:
- Use short vibrations (20-100ms) for subtle feedback
- Use patterns for more complex haptic responses
- Always check `window.vibrationEnabled` before calling `vibrate()`
- Don't overuse - vibration can quickly drain battery
- Test on Android devices, as iOS doesn't support vibration
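Since `vibrate()` accepts plain arrays, patterns are easy to generate programmatically. A small hypothetical helper (the name and defaults are mine, not part of p5-phone) that builds an n-pulse pattern:

```javascript
// Hypothetical helper (not part of p5-phone): build a vibrate() pattern
// of `count` pulses of `pulseMs` each, separated by `pauseMs` pauses.
function pulsePattern(count, pulseMs = 50, pauseMs = 30) {
  const pattern = [];
  for (let i = 0; i < count; i++) {
    pattern.push(pulseMs);
    if (i < count - 1) pattern.push(pauseMs); // no trailing pause
  }
  return pattern;
}

// pulsePattern(3) returns [50, 30, 50, 30, 50]
// In a sketch: if (window.vibrationEnabled) vibrate(pulsePattern(3));
```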
Purpose: Read NFC (Near Field Communication) tags using the Web NFC API. Ideal for interactive installations, scavenger hunts, or any sketch that responds to physical NFC tags.
- ✅ Android - Chrome 89+ and Samsung Internet 15+
- ❌ iOS - Not supported (Web NFC API not available on iOS)
- Requires HTTPS — NFC is blocked on insecure origins
Important: The Web NFC API requires user activation (a tap or click) before scanning can begin — the same pattern used by all other p5-phone permission functions. On unsupported devices/browsers, window.nfcEnabled will be false and calls will be safely ignored with console warnings.
Commands:
- `enableNfcTap(message)` - Tap anywhere on screen to enable NFC scanning
- `enableNfcButton(text)` - Creates a button with custom text to enable NFC
- `stopNfc()` - Stop NFC scanning
Status Variables:
- `window.nfcEnabled` - Boolean indicating if NFC scanning is active
- `window.lastNfcMessage` - Object containing the most recently read tag's data
- `window.lastNfcSerialNumber` - Serial number string of the most recently read tag
User Callback:
Define an nfcRead() function in your sketch to receive tag data when a tag is scanned:
function nfcRead(message, serialNumber) {
// message.serialNumber — tag serial number
// message.records — array of NDEF records, each with:
// .recordType — 'text', 'url', 'mime', etc.
// .data — decoded content (string for text/url, object for JSON, raw for others)
// .mediaType — MIME type (for 'mime' records)
// .id — record id (if present)
// .raw — original DataView
}

Usage:
let tagText = 'No tag scanned yet';
function setup() {
createCanvas(windowWidth, windowHeight);
lockGestures();
enableNfcTap('Tap to enable NFC');
}
function draw() {
background(220);
textAlign(CENTER, CENTER);
textSize(20);
if (!window.nfcEnabled) {
text('NFC not active', width / 2, height / 2);
} else {
text(tagText, width / 2, height / 2);
text('Hold an NFC tag near your phone', width / 2, height / 2 + 40);
}
}
function nfcRead(message, serialNumber) {
tagText = 'Tag: ' + serialNumber;
for (let record of message.records) {
if (record.recordType === 'text' || record.recordType === 'url') {
tagText += '\n' + record.data;
}
}
}

Record Types:
NFC tags contain NDEF records. The most common types are:
| Record Type | `record.data` Contains | Example |
|---|---|---|
| `text` | Decoded string | `"Hello World"` |
| `url` | Decoded URL string | `"https://example.com"` |
| `mime` | Decoded string or parsed JSON | `{ id: 42 }` |
| other | Raw DataView | Binary data |
Best Practices:
- Always check `window.nfcEnabled` before relying on NFC features
- Use the `nfcRead()` callback for real-time tag processing
- Use `window.lastNfcMessage` in `draw()` for displaying the most recent tag
- Test on Android devices with Chrome — NFC is not available on iOS or desktop browsers
- Tags must be NDEF-formatted to be read by the Web NFC API
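For record types the library leaves raw (the `.raw` DataView), NDEF text payloads are simple to unpack by hand. A sketch, assuming a UTF-8 text record (the spec's UTF-16 flag bit clear); this roughly mirrors what `record.data` already gives you for `text` records:

```javascript
// Decode an NDEF 'text' record payload manually (assumes UTF-8 encoding).
// Payload layout per the NDEF text record spec:
//   [status byte][language code][text bytes]
// where the status byte's low 6 bits hold the language-code length.
function decodeNdefText(dataView) {
  const langLength = dataView.getUint8(0) & 0x3f;
  const textBytes = new Uint8Array(
    dataView.buffer,
    dataView.byteOffset + 1 + langLength,
    dataView.byteLength - 1 - langLength
  );
  return new TextDecoder('utf-8').decode(textBytes);
}
```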
Purpose: Enable the Web Speech API for voice input and speech-to-text in mobile browsers.
Important: This does NOT create a p5.AudioIn object — it only activates the audio context needed for the Web Speech API. You must create your own p5.SpeechRec object after enabling.
Commands:
- `enableSpeechTap(message)` - Tap anywhere on screen to enable speech recognition
- `enableSpeechButton(text)` - Creates a button with custom text to enable speech
Usage:
let speechRec;
function setup() {
createCanvas(windowWidth, windowHeight);
// Enable speech recognition
enableSpeechTap('Tap to enable speech recognition');
}
// Use userSetupComplete() to know when permissions are ready
function userSetupComplete() {
if (window.speechEnabled) {
speechRec = new p5.SpeechRec('en-US');
speechRec.continuous = true;
speechRec.interimResults = true;
speechRec.onResult = gotSpeech;
speechRec.start();
}
}
function gotSpeech() {
if (speechRec.resultValue) {
debug('You said:', speechRec.resultString);
}
}

Purpose: Enable both motion sensors and microphone with a single permission prompt, reducing the number of taps required.
Commands:
- `enableAllTap(message)` - Tap anywhere to enable motion sensors + microphone
- `enableAllButton(text)` - Button-based combined activation
Usage:
let mic;
function setup() {
createCanvas(windowWidth, windowHeight);
mic = new p5.AudioIn();
lockGestures();
// One tap enables both sensors and microphone
enableAllTap('Tap to enable sensors & microphone');
}
function draw() {
background(220);
if (window.sensorsEnabled) {
circle(width/2 + rotationY * 3, height/2 + rotationX * 3, 50);
}
if (window.micEnabled) {
let level = mic.getLevel();
rect(10, 10, level * 200, 20);
}
}

Purpose: Simplified camera access optimized for ML5.js machine learning models (FaceMesh, HandPose, BodyPose, etc.). Handles camera initialization, coordinate mapping, mirroring, and display modes automatically.
Key Features:
- Automatic Coordinate Mapping - ML5 keypoints automatically mapped to canvas coordinates
- Mirror Support - Handles front camera mirroring for natural interaction
- Display Modes - Multiple video sizing options (fitHeight, cover, contain, fixed)
- ML5 Optimized - Direct integration with ML5 v1.x models
- Auto-initialization - Camera starts automatically when permissions are granted
Commands:
| Function | Purpose | Parameters |
|---|---|---|
| `createPhoneCamera(active, mirror, mode)` | Create new camera instance | `active`: 'user' or 'environment'; `mirror`: true/false; `mode`: 'fitHeight', 'cover', 'contain', 'fixed' |
| `enableCameraTap(message)` | Tap to enable camera | Optional message string |
| `cam.onReady(callback)` | Execute code when camera ready | Callback function |
| `cam.mapKeypoint(keypoint)` | Map single ML5 keypoint to screen | ML5 keypoint object |
| `cam.mapKeypoints(keypoints)` | Map array of ML5 keypoints | Array of ML5 keypoints |
Properties:
| Property | Description | Type |
|---|---|---|
| `cam.ready` | Camera initialization status | Boolean |
| `cam.video` | p5.js video element | p5.Element |
| `cam.active` | Current camera ('user'/'environment') | String |
| `cam.mirror` | Mirror state | Boolean |
| `cam.mode` | Display mode | String |
| `cam.width` | Video width | Number |
| `cam.height` | Video height | Number |
Basic Setup:
let cam;
let facemesh;
let faces = [];
function setup() {
createCanvas(windowWidth, windowHeight);
// Create camera: front camera, mirrored, fit to canvas height
cam = createPhoneCamera('user', true, 'fitHeight');
// Enable camera (auto-starts if permission granted)
enableCameraTap();
// Start ML5 when camera is ready
cam.onReady(() => {
let options = {
maxFaces: 1,
refineLandmarks: false,
flipHorizontal: false // cam.mapKeypoint() handles mirroring
};
facemesh = ml5.faceMesh(options, modelLoaded);
});
}
function modelLoaded() {
// Start detection - use cam.videoElement for ML5
facemesh.detectStart(cam.videoElement, (results) => {
faces = results;
});
}
function draw() {
background(220);
// Draw camera feed
if (cam.ready) {
image(cam, 0, 0); // PhoneCamera handles positioning automatically
}
// Draw tracked face keypoints
if (faces.length > 0) {
let face = faces[0];
// Map nose tip keypoint (index 1) to screen coordinates
let nose = cam.mapKeypoint(face.keypoints[1]);
// Use coordinates for interaction
fill(255, 0, 0);
circle(nose.x, nose.y, 30);
// Map all keypoints at once
let allPoints = cam.mapKeypoints(face.keypoints);
for (let point of allPoints) {
circle(point.x, point.y, 3);
}
}
}

Display Modes:
| Mode | Behavior |
|---|---|
| `'fitHeight'` | Scale video to canvas height (default, recommended) |
| `'cover'` | Fill entire canvas (may crop video) |
| `'contain'` | Fit entire video in canvas (may show letterboxing) |
| `'fixed'` | Fixed size (set with `cam.fixedWidth`, `cam.fixedHeight`) |
Coordinate Mapping:
The mapKeypoint() and mapKeypoints() functions automatically handle:
- Video-to-canvas scaling
- Mirror transformation (for front camera)
- Offset positioning (for different display modes)
- 3D coordinates (preserves z-depth from BlazePose)
// Single keypoint
let nose = cam.mapKeypoint(face.keypoints[1]);
console.log(nose.x, nose.y, nose.z); // Screen coordinates + depth
// Multiple keypoints
let hands = cam.mapKeypoints(hand.keypoints);
hands.forEach(point => {
circle(point.x, point.y, 5);
});

ML5 Model Examples:
// FaceMesh (468 keypoints)
let options = { maxFaces: 1, refineLandmarks: false, flipHorizontal: false };
facemesh = ml5.faceMesh(options, modelLoaded);
// HandPose (21 keypoints per hand)
let options = { maxHands: 2, runtime: 'mediapipe', flipHorizontal: false };
handpose = ml5.handPose(options, modelLoaded);
// BodyPose (33 keypoints with 3D)
let options = { modelType: 'MULTIPOSE_LIGHTNING', flipped: false };
bodypose = ml5.bodyPose('BlazePose', options, modelLoaded);

Important Notes:
- Always set `flipHorizontal: false` in ML5 options (PhoneCamera handles mirroring)
- Use `cam.videoElement` (native HTML video element) when passing to ML5's `detectStart()`
- Check `cam.ready` before using video or drawing keypoints
- Call `enableCameraTap()` to handle camera permissions automatically
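To make the coordinate mapping concrete, here is a rough sketch of the math behind `cam.mapKeypoint()` in `'fitHeight'` mode. This is my reconstruction from the documented behavior (scale to canvas height, center horizontally, flip x when mirrored), not the library's actual implementation:

```javascript
// Illustrative reconstruction of 'fitHeight' keypoint mapping.
// kp is an ML5-style keypoint in video pixel coordinates.
function mapKeypointFitHeight(kp, videoW, videoH, canvasW, canvasH, mirror) {
  const scale = canvasH / videoH;                  // match canvas height
  const offsetX = (canvasW - videoW * scale) / 2;  // center horizontally
  let x = kp.x * scale + offsetX;
  if (mirror) x = canvasW - x;                     // front-camera mirroring
  return { x: x, y: kp.y * scale };
}
```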
Purpose: Essential on-screen debugging system for mobile development where traditional browser dev tools aren't accessible. Provides automatic error catching, timestamped logging, and color-coded messages.
Why use it: Mobile browsers often hide JavaScript errors, making debugging difficult. This system displays all errors, warnings, and custom messages directly on your mobile screen with timestamps and color coding.
Commands:
| Function | Purpose | Example |
|---|---|---|
| `showDebug()` | Show debug panel and enable error catching | `showDebug()` |
| `hideDebug()` | Hide debug panel | `hideDebug()` |
| `toggleDebug()` | Toggle panel visibility | `toggleDebug()` |
| `debug(...args)` | Log messages (white text) | `debug("App started", frameRate())` |
| `debugError(...args)` | Display errors (red text) | `debugError("Connection failed")` |
| `debugWarn(...args)` | Display warnings (yellow text) | `debugWarn("Low battery")` |
| `debug.clear()` | Clear all messages | `debug.clear()` |
Key Features:
- Automatic Error Catching - JavaScript errors automatically displayed with red styling
- Error Location - Shows filename and line number for easy debugging
- Timestamps - All messages include precise timestamps
- Color Coding - Errors (red), warnings (yellow), normal messages (white)
- Mobile Optimized - Touch-friendly interface that works on small screens
- Keyboard Shortcuts - Press 'D' to toggle, 'C' to clear (when debug is enabled)
Critical Setup:
function setup() {
// IMPORTANT: Call showDebug() FIRST to catch setup errors
showDebug();
createCanvas(windowWidth, windowHeight);
// Any errors after this point will be automatically caught and displayed
}

Usage Examples:
// Basic logging
debug("Touch at:", mouseX, mouseY);
debug("Sensors enabled:", window.sensorsEnabled);
// Error handling
debugError("Failed to load image");
debugWarn("Frame rate dropping:", frameRate());
// Objects and arrays
debug("Touch points:", touches);
debug({rotation: rotationX, acceleration: accelerationX});

Beyond the default Tap and Button styles, p5-phone v1.7.0 provides three additional ways to present permission prompts. Each permission type (sensors, microphone, speech, all, camera) has all five variants.
The canvas displays a centered message until the user taps. Great for "full-screen tap to start" experiences where you want the canvas to feel like a splash screen.
Commands:
- `enableSensorCanvas(message)`
- `enableMicCanvas(message)`
- `enableSpeechCanvas(message)`
- `enableNfcCanvas(message)`
- `enableAllCanvas(message)`
- `enableCameraCanvas(message)`
Usage:
function setup() {
createCanvas(windowWidth, windowHeight);
lockGestures();
enableSensorCanvas('Tap canvas to begin');
}
function draw() {
if (!window.sensorsEnabled) return;
background(220);
circle(width/2 + rotationY * 3, height/2 + rotationX * 3, 50);
}

A styled banner slides in from the top of the screen with an animated entrance. After the user taps, the banner slides away and permissions are activated. Ideal when you want to keep your canvas visible underneath the prompt.
Commands:
- `enableSensorBanner(message)`
- `enableMicBanner(message)`
- `enableSpeechBanner(message)`
- `enableNfcBanner(message)`
- `enableAllBanner(message)`
- `enableCameraBanner(message)`
Usage:
function setup() {
createCanvas(windowWidth, windowHeight);
lockGestures();
enableSensorBanner('Tap here to enable motion sensors');
}
function draw() {
background(220);
if (window.sensorsEnabled) {
text('Rotation: ' + rotationX.toFixed(1), 20, 60);
}
}

Bind the permission activation to any existing HTML element on the page using a CSS selector. This gives you full control over the look and placement of the trigger. The element is hidden after successful activation.
Commands:
- `enableSensorOn(selector)`
- `enableMicOn(selector)`
- `enableSpeechOn(selector)`
- `enableNfcOn(selector)`
- `enableAllOn(selector)`
- `enableCameraOn(selector)`
Usage (HTML):
<button id="start-btn" style="font-size:24px; padding:20px;">
Start Experience
</button>

Usage (sketch.js):
function setup() {
createCanvas(windowWidth, windowHeight);
lockGestures();
enableAllOn('#start-btn');
}
function draw() {
if (!window.sensorsEnabled) return;
background(220);
circle(width/2 + rotationY * 3, height/2 + rotationX * 3, 50);
}

| Style | Best For | How It Works |
|---|---|---|
| Tap | Quick prototypes | Full-screen transparent overlay |
| Button | Clear UI | Auto-generated styled button |
| Canvas | Splash screens | Message drawn on the p5 canvas |
| Banner | Polished apps | Animated slide-in banner |
| Custom | Custom designs | Bind to your own HTML element |
iOS Safari requires a user gesture (tap, click) before granting access to motion sensors and the microphone. This is a browser security requirement — it cannot be bypassed. Android does not have this restriction, but the tap/button still works on Android (it's a no-op).
- Make sure you're serving over HTTPS — motion sensors and microphone are blocked on insecure origins.
- Check that you've called one of the `enable...` functions in `setup()`.
- On iOS, check that Settings → Safari → Motion & Orientation Access is enabled.
Make sure you're calling showDebug() in your sketch. The debug panel appears at the bottom of the screen. You can use debug(), debugWarn(), and debugError() to log to it.
Define a userSetupComplete() function — it's called automatically after the user taps and all requested permissions have been granted:
function userSetupComplete() {
debug('All permissions granted!');
debug('Sensors:', window.sensorsEnabled);
debug('Mic:', window.micEnabled);
}

You can also check the status variables at any time:
- `window.sensorsEnabled`
- `window.micEnabled`
- `window.speechEnabled`
- Ensure your page is served over HTTPS.
- Check that you've created the camera with `createPhoneCamera()` in your sketch.
- Grant camera permission in the browser when prompted.
- Some iOS versions require the user to explicitly allow camera access in Settings → Safari → Camera.
The Vibration API is not supported on iOS. Treat vibration as an Android-only enhancement: check `window.vibrationEnabled` (or `'vibrate' in navigator`) before calling `vibrate()`.