This repository contains algorithms for map-based robot localization, specifically for maps composed of triangle meshes or complete scene graphs. Such maps may be provided by architects who designed the building in which the robot operates, or generated autonomously by the robot via Simultaneous Localization and Mapping (SLAM). Note that map-based localization differs from SLAM: it focuses on estimating the robot's pose within a potentially large map, whether the initial pose is roughly known (tracking) or entirely unknown from the start (the kidnapped-robot problem). Map-based localization is essential for precisely planning the robot's missions on a given map.
MICP-L: Mesh-based ICP for Robot Localization Using Hardware-Accelerated Ray Casting. An approach that registers range sensor data directly to a mesh in order to localize a mobile robot, using hardware-accelerated ray casting correspondences (see publications).
| Hilti: 6DoF Localization | MulRan: Large-scale scenes |
|---|---|
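In MICP-L, correspondences between sensor data and the map come from simulating the range sensor inside the mesh via hardware-accelerated ray casting (through Rmagine's Embree/OptiX backends). As a loose, CPU-only illustration of the underlying primitive, not the actual implementation, a single ray-triangle intersection can be sketched with the Möller-Trumbore algorithm:

```python
import numpy as np

def ray_triangle_intersect(orig, d, v0, v1, v2, eps=1e-9):
    """Moeller-Trumbore ray-triangle intersection.
    Returns the hit distance t along ray direction d, or None on miss."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(d, e2)
    det = e1.dot(p)
    if abs(det) < eps:
        return None          # ray is parallel to the triangle plane
    inv = 1.0 / det
    t_vec = orig - v0
    u = t_vec.dot(p) * inv
    if u < 0.0 or u > 1.0:
        return None          # outside triangle (barycentric u)
    q = np.cross(t_vec, e1)
    v = d.dot(q) * inv
    if v < 0.0 or u + v > 1.0:
        return None          # outside triangle (barycentric v)
    t = e2.dot(q) * inv
    return t if t > eps else None

# one "correspondence": cast a sensor ray from the estimated pose into the
# mesh; the hit point on the surface is the correspondence for that beam
v0 = np.array([0.0, -1.0, -1.0])
v1 = np.array([0.0, 1.0, -1.0])
v2 = np.array([0.0, 0.0, 1.0])
t = ray_triangle_intersect(np.array([-2.0, 0.0, 0.0]),
                           np.array([1.0, 0.0, 0.0]), v0, v1, v2)
print(t)  # → 2.0 (distance from ray origin to the mesh surface)
```

In the actual pipeline this happens for thousands of beams per scan on dedicated ray tracing hardware; the resulting hit points feed the ICP pose correction.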
Requirements:
- At least one range sensor mounted and running
- A triangle mesh as map
- A prior odometry estimate of the robot provided via TF

An IMU prior is also possible, as long as it is integrated as a TF transform, e.g., with a Madgwick filter.
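The prior only needs to end up as an orientation in the TF tree; any filter that produces one works. As a rough, dependency-free illustration of the idea (not the Madgwick filter itself, which additionally corrects gyro drift with accelerometer and magnetometer measurements), integrating angular velocity into an orientation quaternion looks like:

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of quaternions in [w, x, y, z] order."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def integrate_gyro(q, omega, dt):
    """First-order orientation update from angular velocity omega (rad/s)."""
    dq = np.concatenate([[1.0], 0.5 * dt * omega])  # approx. of exp(omega*dt/2)
    q = quat_mul(q, dq)
    return q / np.linalg.norm(q)  # renormalize to a unit quaternion

# rotate 90 degrees about z in 1000 small steps
q = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(1000):
    q = integrate_gyro(q, np.array([0.0, 0.0, np.pi / 2]), 1e-3)
print(q)  # ≈ [cos(pi/4), 0, 0, sin(pi/4)]
```

In practice you would simply run an existing filter node and broadcast its output as a TF transform rather than writing this yourself.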
Read more details here, or if you are looking for a quick start and have no robot available, go to our hands-on examples: https://github.com/amock/rmcl_examples.
Please cite the following paper when using the MICP-L method in your scientific work:
```bibtex
@inproceedings{mock2024micpl,
  title={{MICP-L}: Mesh-based ICP for Robot Localization Using Hardware-Accelerated Ray Casting},
  author={Mock, Alexander and Wiemann, Thomas and Pütz, Sebastian and Hertzberg, Joachim},
  booktitle={2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year={2024},
  pages={10664--10671},
  doi={10.1109/IROS58592.2024.10802360}
}
```
The paper is available on IEEE Xplore and as a preprint on arXiv. The experiments are available at https://github.com/amock/micp_experiments, but they primarily target the ROS 1 version; see the older branches or commits for reference.
Ray Casting Monte Carlo Localization (RMCL) provides a practical, real-time implementation of Monte Carlo Localization (MCL) for global robot localization, accelerated by high-performance ray tracing over triangle meshes and geometric scene graphs. MCL has a decades-long track record; our focus is making it easy to deploy and tune on real robots. The pipeline scales across diverse hardware with parameters to meet tight compute and memory budgets (including for our smallest robots). The documentation begins with hands-on usage and configuration of `rmcl_localization_node`, followed by a concise overview of the underlying concepts and design choices.
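As a loose, dependency-light sketch of the MCL loop that RMCL builds on (a toy 1D world with a simulated range measurement, not the actual `rmcl_localization_node` pipeline), each iteration applies a motion update, weights particles by measurement likelihood, and resamples:

```python
import numpy as np

rng = np.random.default_rng(42)

def mcl_step(particles, control, measurement, measure_fn,
             motion_noise=0.05, meas_sigma=0.2):
    """One Monte Carlo Localization iteration on 1D toy poses.
    Since we resample every step, weights start uniform each call."""
    # 1) motion update: apply the odometry control with noise
    particles = particles + control + rng.normal(0.0, motion_noise, len(particles))
    # 2) sensor update: weight particles by the measurement likelihood
    expected = measure_fn(particles)
    w = np.exp(-0.5 * ((measurement - expected) / meas_sigma) ** 2)
    w /= w.sum()
    # 3) resample particles proportionally to their weights
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# toy world: a wall at x = 10; the range sensor measures distance to it
wall = 10.0
measure = lambda x: wall - x
true_x = 2.0
particles = rng.uniform(0.0, 10.0, 500)   # global init: pose unknown
for _ in range(20):
    true_x += 0.3                             # robot drives forward
    z = wall - true_x + rng.normal(0.0, 0.1)  # noisy range reading
    particles = mcl_step(particles, 0.3, z, measure)
print(particles.mean())  # particle cloud converges near true_x = 8.0
```

In RMCL, step 2 is where the mesh map and ray tracing come in: the expected measurement for each particle is obtained by casting the sensor's rays from that particle's pose into the triangle mesh.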
See the longer pre-release video here. Read more details here.
Since RMCL is still in a pre-release stage, no formal publication is available at this time. When using RMCL in your scientific work, please reference one of the closely related projects such as rmagine or MICP-L, as many of their concepts have been incorporated into RMCL. Alternatively, you may cite this software package directly as follows:
```bibtex
@software{amock2025rmcl,
  author = {Alexander Mock},
  title = {{RMCL: Mobile Robot Localization in 3D Triangle Meshes \& Geometric Scene Graphs}},
  license = {BSD-3-Clause},
  url = {https://github.com/uos/rmcl},
  year = {2025},
  month = {10}
}
```
Dependencies:
- ROS 2 (check the compatible branches)
- Rmagine (v2.3.2 or later), downloaded and placed into your ROS workspace
- Recommended: install the OptiX backend if an NVIDIA GPU is available
- Optional for functionality, but required for visualizations: mesh_tools
Clone this repository into your ROS workspace and build it:

```bash
colcon build
```
| RMCL Branch | Supported ROS 2 versions |
|---|---|
| main | humble, jazzy |
To navigate a robot automatically and safely through uneven terrain, the combination of RMCL and MeshNav is well suited: https://github.com/naturerobots/mesh_navigation. As we presented at ROSCon 2023: