The LayerAwareLogging Unity plugin operationalizes the proposed XR data pipeline taxonomy for Meta Quest HMDs, specifically the Quest Pro, which supports eye, face, and body tracking.
For the Quest Pro, behavioral tracking data propagates through the pipeline shown in the following figure (excluding Layer 4).
Because sensor-level measurements (Layer 0) and direct runtime logging (Layer 1) are inaccessible on Meta Quest Pro and Meta Quest devices in general, our plugin logs at downstream stages where software-level processing can affect data fidelity and timing.
At the engine/framework level (Layer 2), the plugin logs all tracking outputs accessed through API calls (exposed through Meta's `OVRPlugin` script in the Core SDK), capturing the earliest application-visible representation of each modality.
At the application level (Layer 3), logging is more nuanced and depends on the SDKs and custom logic added by the researcher or developer. The first logging point is where signals are exposed through the XR Core SDK's OVR classes (for example, `OVREyeGaze`, `OVRBody`, and `OVRFaceExpressions`). Body and face tracking undergo additional processing, so their data can also be logged after retargeting to the application's avatar rig and conversion to bone transforms or blendshapes; this is where our plugin logs face and body motion at Layer 3.
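Because software-level processing between layers can affect timing, logs captured at Layer 2 and Layer 3 can be compared after the fact. The sketch below is a minimal illustration in plain Python: it assumes each log provides one (frame index, timestamp) pair per frame, which is a simplification of whatever schema the plugin's actual CSV files use.

```python
# Sketch: estimate the per-frame latency added between a Layer 2 log and a
# Layer 3 log. The (frame, timestamp) row shape is an assumption for
# illustration; consult the plugin's actual CSV columns.

def layer_latency(layer2_rows, layer3_rows):
    """Return (frame, delta) pairs: Layer 3 timestamp minus Layer 2 timestamp,
    matched by frame index."""
    t2 = {frame: ts for frame, ts in layer2_rows}
    return [(frame, ts - t2[frame]) for frame, ts in layer3_rows if frame in t2]

# Synthetic example: Layer 3 lags Layer 2 by a few milliseconds per frame.
l2 = [(0, 0.000), (1, 0.011), (2, 0.022)]
l3 = [(0, 0.003), (1, 0.015), (2, 0.026)]
for frame, delta in layer_latency(l2, l3):
    print(f"frame {frame}: +{delta * 1000:.1f} ms")
```

The frame-index join keeps the comparison robust when one layer drops frames that the other records.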
- Create a new Unity Project (e.g. the Universal 3D template).
- Inside your project, go to `Window -> Package Manager -> Add Package from git URL`.
- Paste this repository's URL and click `Add`.
- Download the repository as a ZIP file and extract it to your disk.
- Create a new Unity Project (e.g. the Universal 3D template).
- Inside your project, go to `Window -> Package Manager -> Add Package from disk`.
- Select the `package.json` located in the package directory and click `Open`.
In your project, accept the available fixes from the Project Setup Tool.

In Project Settings > XR Plug-in Management > OpenXR, enable the following profiles:

We provide two sample scenes: one for logging and one for replay. To import the samples:
- Go to `Window -> Package Manager`.
- Select `Layer Aware Logging`.
- Go to `Samples`.
- Import the `Tracking Sample Scene`.
- The scenes will be imported into `Assets/Samples/Layer Aware Logging/1.0.0/Tracking Sample Scene/Scenes`.
- Open the `TrackingScene`.
- In the Scene Hierarchy, select the `CameraRig`. In the `OVR Manager` component, under `Quest Features`, set Body, Face, and Eye Tracking Support to Required.
- Connect the Meta Quest Pro.
- Go to `Build Settings -> Build and Run`.
- Once the app starts, data is logged automatically from Layer 2 and Layer 3 to CSV files in the application data folder. Make sure to close the application so that data logging finishes properly.
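For post-hoc analysis, the logged CSV files can be read with standard tooling. Below is a minimal Python sketch; the header and column names in it are placeholders for illustration, so check the actual headers written by the plugin before adapting it.

```python
import csv
import io

# Sketch: parse a tracking log into a list of per-frame dicts.
# The columns below are hypothetical; the plugin's real CSVs may differ.
sample = io.StringIO(
    "timestamp,gaze_dir_x,gaze_dir_y,gaze_dir_z\n"
    "0.011,0.01,-0.02,0.99\n"
    "0.022,0.02,-0.01,0.99\n"
)

# In practice, replace `sample` with open("<your log>.csv").
rows = [
    {key: float(value) for key, value in row.items()}
    for row in csv.DictReader(sample)
]

print(len(rows), "rows; first timestamp:", rows[0]["timestamp"])
```

`csv.DictReader` keys each value by the header row, so downstream analysis code stays readable even if the column order changes between plugin versions.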
The offline replay lets you play back the data logged in the tracking scene on an avatar, including body pose, eye gaze, and face expressions. You may import your own avatar.
An example of replay is provided in the ReplayScene in the samples of the package:
- Go to `Assets -> Samples -> Layer Aware Logging -> 1.0.0 -> Tracking Sample Scene -> Scenes`.
- Open the `ReplayScene`.
- On the `RealisticCharacter`, set the `CSV Text` properties in the different pose providers (i.e. `CSVBodyPoseProvider`, `CSVEyePoseProvider`, `CSVFaceExprProvider`) to your CSV files.
- When the application starts, the logged data will be played back on the avatar.
ANONYMIZED

