Super-LiDAR-Intensity

This is the code repository for the IEEE-RAL'26 paper "Super LiDAR Intensity for Robotic Perception".

News

- 2026-03-13: The code, model weights, and data have been released.

Dataset Display

Dataset 1

Dataset 2

Prerequisites

Ubuntu and ROS

This repository has been tested on Ubuntu 20.04 + ROS Noetic.

Dependencies

Build

mkdir -p ~/super_ws/src
cd ~/super_ws/src
git clone https://github.com/IMRL/Super-LiDAR-Intensity.git
catkin build
source ~/super_ws/devel/setup.bash

Setup Python Environment

Create Conda Environment

conda env create -f SuperLidarIntensity.yaml -y
conda activate super

Usage

Data Preparation

Generate intensity images from ROS bag

The Super package provides tools to convert LiDAR point clouds into panoramic or virtual-camera intensity images for training and evaluation. We also provide a test bag (test_bag) for trying out intensity image generation.

roslaunch Super intensity_image.launch
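The launch file above performs the projection inside the Super package. As a rough illustration of the underlying idea, a spherical (panoramic) projection of LiDAR points into an intensity image can be sketched as below; the image size and vertical field of view here are illustrative assumptions, not the package's actual parameters:

```python
import numpy as np

def panoramic_intensity_image(points, intensities, height=64, width=1024,
                              fov_up=15.0, fov_down=-15.0):
    """Project 3D LiDAR points (N, 3) with per-point intensities (N,)
    onto a panoramic intensity image via spherical coordinates."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.linalg.norm(points, axis=1)
    yaw = np.arctan2(y, x)                                   # azimuth in [-pi, pi]
    pitch = np.arcsin(np.clip(z / np.maximum(r, 1e-9), -1.0, 1.0))

    fov_up_r, fov_down_r = np.radians(fov_up), np.radians(fov_down)
    # Map azimuth to image columns and elevation to image rows.
    u = ((1.0 - (yaw + np.pi) / (2.0 * np.pi)) * width).astype(int) % width
    v = ((fov_up_r - pitch) / (fov_up_r - fov_down_r) * height).astype(int)
    v = np.clip(v, 0, height - 1)

    img = np.zeros((height, width), dtype=np.float32)
    img[v, u] = intensities                                  # last point wins per pixel
    return img
```

A point straight ahead on the sensor's x-axis lands in the middle column at the horizon row; denser real scans simply fill more pixels of the same grid.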

Training & Inference

The dataset is available here.

conda activate super
cd ~/super_ws/src/Super/scripts/panoramic_virtualCamera

For training:

python main.py --config config_panoramic.example.yaml

For inference:

python infer.py --config config_panoramic.example.yaml --view_type panoramic
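The released model weights use a .ts extension, which suggests TorchScript archives; such archives reload without the Python class that defined the model. A minimal sketch of that load path, using a stand-in module in place of the real weights (the file name and the panoramic input shape below are assumptions):

```python
import torch

# Stand-in for a released .ts archive: a scripted model round-trips
# through a file and reloads without its original class definition.
scripted = torch.jit.script(torch.nn.Conv2d(1, 1, kernel_size=3, padding=1))
torch.jit.save(scripted, "example.ts")

model = torch.jit.load("example.ts").eval()
intensity = torch.rand(1, 1, 64, 1024)  # hypothetical panoramic intensity input
with torch.no_grad():
    out = model(intensity)
print(tuple(out.shape))  # spatial size preserved by padding=1
```

Swapping "example.ts" for the downloaded weights is the only change needed, provided the input tensor matches the shape the model was exported with.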

SuperLidarIntensity for Specific Applications

The model weights for loop closure detection are available as super_panoramic.ts and orbvoc.dbow3. Please place them in the corresponding paths.

Loop closure detection for a single sequence:

The ROS bag for loop closure detection on a single sequence is available here.

cd ~/super_ws
source devel/setup.zsh

roslaunch fast_lio mapping_avia.launch
roslaunch imaging_lidar_place_recognition run.launch
rosbag play --pause --clock example.bag

Loop closure detection across different sequences (e.g., day and night):

The ROS bag for loop closure detection across different sequences is available here.

cd ~/super_ws
source devel/setup.zsh

roslaunch fast_lio mapping_avia.launch
roslaunch diff_imaging_lidar_place_recognition run.launch
rosbag play --pause --clock example.bag

Traffic Lane Detection:

The model weights for traffic lane detection are available as super_virtual_camera.ts and lane.ts. Please place them in the corresponding paths.

The ROS bag for traffic lane detection is available here.

cd ~/super_ws
source devel/setup.zsh

roslaunch fast_lio mapping_avia.launch
roslaunch lane_detect lane_detect.launch

(If you need to filter out moving objects, use the M-detector and compile lane_detect_static.cpp.)

rosbag play --pause --clock example.bag

Others

Collect ROSbag

We provide code for conveniently collecting static Livox data. Please refer to record_rosbag.cpp.

Acknowledgement

Part of the code references the implementations of Imaging Lidar Place Recognition and LaneATT. We thank the authors for their awesome work!

Reference

@article{gao2026super,
  title={Super LiDAR Intensity for Robotic Perception},
  author={Gao, Wei and Zhang, Jie and Zhao, Mingle and Zhang, Zhiyuan and Kong, Shu and Ghaffari, Maani and Song, Dezhen and Xu, Chengzhong and Kong, Hui},
  journal={IEEE Robotics and Automation Letters},
  year={2026},
  publisher={IEEE}
}
