Tracking Toolkit

Tracking Toolkit allows you to view and record OpenXR devices inside Blender with NLA integration and SMPTE timecode.

It enables camera and object tracking with high precision, and can also record actor movement for motion capture.

You can view the demonstration video on YouTube.

If you'd like to see support for a new device, let me know! You can open an issue for any questions.

Requirements

Tracking Toolkit works with Blender 5.0 and later.

Additionally, you will need an OpenXR runtime and a device to go with it.

Runtime    Example Devices
SteamVR    Valve Index, HTC VIVE
Meta Link  Meta Quest 3, Oculus Rift
VDXR       Meta Quest 3, Vive Focus
Monado     Open Source/Community

Some runtimes do not support OpenGL as Blender's Display Graphics Backend.

Installation

  1. Download tracking_toolkit-latest.zip from the latest release
  2. Open Blender, use Edit > Preferences > Get Extensions > (Arrow in top right corner) > Install from disk, and select the downloaded zip.

Usage

  1. Navigate to the new "Track TK" tab on the sidebar.

  2. Press the Start/Connect OpenXR button to establish a connection to your runtime. If your runtime is not open, it will be launched. The list below will populate with names of detected controllers/trackers.

  3. Press Create References to add the detected trackers to your scene. This will create bones, but you can toggle the checkbox to create empties instead.

  4. At this point, a large Start Recording button will be visible. When pressed, it records the trackers' positions until you press stop. Existing recordings are pushed down onto a new NLA strip and muted.

Additional Information

Tracker List

When new trackers are detected, they will appear on the list. You can press Create References again to add them to the scene.

Controllers remain on the list even after they disconnect.

You can click the eyeball icon to show and hide the trackers in the viewport.

⚠️ If you are using Vive Trackers, you will need to use SteamVR as your OpenXR runtime. You will also need to assign a role to each tracker (waist, foot, etc.) in the SteamVR tracker settings.

You can double click a tracker's name on the list to give it a nickname. You should not directly rename empties or bones in the outliner.

You can also set nicknames in the addon preferences that apply to all future files.

References

References represent your tracker in the scene.

These will either be bones in an armature, or empties. You can toggle the Use Bones For Trackers checkbox at any time to change this; all your existing takes/actions will be converted.

Each tracker will have two references:

  • The tracking point follows the tracker's exact location in 3D space and receives the keyframe data during recording. Leave it untouched: its data is overwritten each time you record.

  • The offset point is a child of the tracking point. It receives no data, and should be used for tweaking/aligning the tracker. Objects or cameras in your scene should be constrained to this offset, rather than the tracking point.
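
Because the offset point is a child of the tracking point, its world position is the tracking point's transform applied to the offset's local position — which is why tweaks made on the offset survive re-recording. A minimal 2D sketch (plain Python, not the add-on's code; the function name is illustrative):

```python
import math


def offset_world_position(track_pos, track_yaw, local_offset):
    """World-space position of an offset point parented to a tracking point.

    track_pos: (x, y) tracker position in world space.
    track_yaw: tracker rotation in radians.
    local_offset: (x, y) offset in the tracker's local space.
    """
    c, s = math.cos(track_yaw), math.sin(track_yaw)
    ox, oy = local_offset
    # Rotate the local offset by the tracker's orientation, then translate
    return (track_pos[0] + c * ox - s * oy,
            track_pos[1] + s * ox + c * oy)


# An offset one unit "forward" of a tracker that is rotated 90 degrees:
print(offset_world_position((1.0, 2.0), math.pi / 2, (1.0, 0.0)))
```

When the tracker moves or rotates, the recorded keyframes land on the tracking point, and the offset rides along on top — so constraining your camera or object to the offset keeps your manual alignment intact across takes.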

Recording

By default, trackers will be recorded at your scene's FPS. You can override this in the addon preferences.

High FPS values may cause performance issues, or prevent the recording from reflecting true tracker positions.

There is a dropdown below the record button that allows you to set a delay before data is captured.

Each time you record a new take, old ones are pushed down onto new NLA strips and muted. The action's name will be an SMPTE timecode (prefixed with the tracker's name if using empties).
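
As a rough illustration of those action names (not the add-on's actual code), a non-drop-frame SMPTE timecode can be derived from a frame index and the scene FPS:

```python
def smpte_timecode(frame: int, fps: int) -> str:
    """Convert a frame index to a non-drop-frame HH:MM:SS:FF timecode."""
    ff = frame % fps                  # frames within the current second
    total_seconds = frame // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"


# 1 hour, 1 minute, 1 second and 12 frames at 24 fps:
print(smpte_timecode(3661 * 24 + 12, 24))  # → 01:01:01:12
```

Drop-frame formats (e.g. 29.97 fps, written with a `;` separator) need extra frame-skipping logic that this sketch does not attempt.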

Troubleshooting

Here are solutions to common problems. Feel free to open an issue if you run into anything else.

  • Disconnecting and reconnecting OpenXR with the button at the top of the Tracking Toolkit panel often resolves minor issues.
  • Restarting Blender and/or your runtime also fixes many common problems.
  • Manual changes to the reference objects, bones, and names can cause issues. You may need to start with a fresh scene in those cases.
Frequently Asked Questions

Q: How many trackers do I need?

A: You only need one to get started, but you'll need more for things like full-body motion capture.

Q: Can I view my scene in VR?

A: No. There are technical limitations that would make this difficult, and there's no current plan to support it.

Q: Can I use only Vive Trackers, without a headset?

A: Yes! Make sure to disable Pause VR when headset is idle in the SteamVR video settings. If you have no headset attached at all, you may need to enable the Mock HMD driver.
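
If you do need the mock HMD, SteamVR's null driver is commonly enabled through its settings files. The fragment below follows the widely shared null-driver setup; treat the file location and key names as assumptions, since they can differ between SteamVR versions:

```json
{
  "steamvr": {
    "requireHmd": false,
    "forcedDriver": "null",
    "activateMultipleDrivers": true
  },
  "driver_null": {
    "enable": true
  }
}
```

These keys are typically merged into `Steam/config/steamvr.vrsettings`; some SteamVR versions also require setting `"enable": true` in the null driver's own `default.vrsettings`.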

Q: Will my scene animations play during recording?

A: No. That may change in the future if needed.

Credits

Lead Developer & Maintainer: Ethan Porcaro

Technical Art & Project Management: John Kraus

Special thanks to Christopher Bruns for the amazing pyopenxr library.

Contributions

Tracking Toolkit will always remain free, and contributions are welcome. You can donate on Ko-fi and Patreon.

If you want to assist with development, see DEVELOPMENT.md.

License

This extension is licensed under GPL 3.0 or later.

Files under /images and /assets are in the Public Domain (CC0).

Branding files are public domain, but trademark rights are reserved.

Copyright (C) 2026 Ethan Porcaro

This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program. If not, see <https://www.gnu.org/licenses/>.

Projects using Tracking Toolkit
