This repo contains the official code of our paper: Simulate Any Radar: Attribute-Controllable Radar Simulation via Waveform Parameter Embedding.
Authors: Weiqing Xiao*, Hao Huang*, Chonghao Zhong*, Yujie Lin, Nan Wang, Xiaoxue Chen, Zhaoxi Chen, Saining Zhang, Shuocheng Yang, Pierre Merriaux, Lei Lei, Hao Zhao†
## News

- June 4, 2025: Release paper.
- June 3, 2025: Release checkpoints of ICFAR-Net and downstream models.
- June 1, 2025: Release project page.
- Release camera-based pipeline.
(a) SA-Radar (i.e., Ctrl-RS) enables controllable and realistic radar simulation by conditioning on customizable radar attributes. It supports flexible scene editing such as attribute modification, actor removal, and novel trajectories. (b) SA-Radar improves performance on various tasks, including semantic segmentation and 2D/3D object detection. In all settings, SA-Radar's synthetic data either matches or surpasses real data, and provides consistent gains when combined with real-world datasets.
- From left to right are the RGB image and the range-azimuth slices of the radar cube for Ground-Truth, Simulation, Attribute Modification, Actor Removal, and Novel Trajectory.
- Comparison of detection results between the model trained on SA-Radar simulation data and the baseline on real sequences.
- Comparison of detection results between the model trained on SA-Radar simulation data and the baseline on simulated sequences.
- Environment Setup
- Dataset Preparation
- Evaluate the Pre-trained Model
- Train Your Model
- Run Radar Simulation
- Downstream Tasks
## Environment Setup

First, create a new Conda environment and specify the Python version:

```shell
conda create -n SA_Radar python=3.11.9
conda activate SA_Radar
```

Then install the dependencies:

```shell
pip install torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 --index-url https://download.pytorch.org/whl/cu121
pip install opencv-python
pip install scikit-image
pip install tensorboard==2.12.0
pip install matplotlib
pip install tqdm
pip install timm==0.5.4
pip install numpy==1.26.4
```

## Dataset Preparation

Please keep the same directory tree as shown in the RADDet dataset.
Download the dataset and arrange it as the following directory tree:
```
|-- train
|   |-- RAD
|   |   |-- part1
|   |   |   |-- ******.npy
|   |   |   |-- ******.npy
|   |   |-- part2
|   |   |   |-- ******.npy
|   |   |   |-- ******.npy
|   |-- gt
|   |   |-- part1
|   |   |   |-- ******.pickle
|   |   |   |-- ******.pickle
|   |   |-- part2
|   |   |   |-- ******.pickle
|   |   |   |-- ******.pickle
|   |-- stereo_image
|   |   |-- part1
|   |   |   |-- ******.jpg
|   |   |   |-- ******.jpg
|   |   |-- part2
|   |   |   |-- ******.jpg
|   |   |   |-- ******.jpg
|-- test
|   |-- RAD
|   |   |-- part1
|   |   |   |-- ***
|   |-- gt
|   |   |-- part1
|   |   |   |-- ***
|   |-- ***
```

## Evaluate the Pre-trained Model

To evaluate ICFAR-Net with the provided checkpoints, run:

```shell
python evaluate.py --restore_ckpt ./models/icfar-net.pth --attribute
```
or

```shell
python evaluate.py --restore_ckpt ./models/icfar-net_trained_by_A.pth --attribute
```

or

```shell
python evaluate.py --restore_ckpt ./models/icfar-net_trained_by_B.pth --attribute
```

or

```shell
python evaluate.py --restore_ckpt ./models/icfar-net_trained_by_C.pth --attribute
```

## Train Your Model

First, build the mixed training dataset:

```shell
python make_mixed_dataset_step1.py
```
```shell
python make_mixed_dataset_step2.py
```

Then start training:

```shell
python train.py --logdir ./checkpoints/icfar_mixed_bs3_lr0.0002_50e --train_datasets raddet carrada raddet_by_mr --attribute --segment_mask_loss --l1_loss --sml1_loss
```
After training, evaluate the resulting checkpoint:

```shell
python evaluate.py --restore_ckpt ./checkpoints/icfar_mixed_bs3_lr0.0002_50e/icfar-net.pth --attribute
```

## Run Radar Simulation

To run the simulation with the provided checkpoint:

```shell
python demo.py --restore_ckpt ./models/icfar-net.pth --save_numpy --version train --attribute
```
or
```shell
python demo.py --restore_ckpt ./checkpoints/icfar_mixed_bs3_lr0.0002_50e/icfar-net.pth --save_numpy --version train --attribute
```

To run the simulation on NuScenes:

```shell
python demo_on_NuScenes.py --restore_ckpt ./models/icfar-net.pth --attribute --time_steps 100
```
or
```shell
python demo_on_NuScenes.py --restore_ckpt ./models/icfar-net_wo-re.pth --time_steps 100
```

To modify radar attributes, edit the `attribute_list` in demo.py directly.
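The controllable radar attributes ultimately trace back to FMCW waveform parameters. As an illustration only (the parameter values below are hypothetical examples, not values from SA-Radar's code), the textbook relations between waveform parameters and radar resolution look like this:

```python
# Classic FMCW resolution relations; all parameter values are
# hypothetical examples, not taken from the SA-Radar repository.
c = 3e8                # speed of light (m/s)
bandwidth = 4e9        # chirp bandwidth B (Hz)
f_carrier = 77e9       # carrier frequency (Hz)
t_chirp = 60e-6        # chirp duration Tc (s)
n_chirps = 128         # chirps per frame N

wavelength = c / f_carrier

range_res = c / (2 * bandwidth)                  # delta_R = c / (2B) = 0.0375 m
vel_res = wavelength / (2 * n_chirps * t_chirp)  # delta_v = lambda / (2 * N * Tc), ~0.254 m/s
vel_max = wavelength / (4 * t_chirp)             # unambiguous velocity, ~16.2 m/s

print(range_res, vel_res, vel_max)
```

Changing bandwidth or chirp count changes these resolutions, which is the kind of attribute variation that waveform parameter embedding exposes to the simulator.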
To simulate with angle rotation (novel trajectories):

```shell
python demo.py --restore_ckpt ./models/icfar-net.pth --save_numpy --version train --attribute --angle_rotation
```

To remove actors from the scene:

```shell
python demo.py --restore_ckpt ./models/icfar-net.pth --save_numpy --version train --attribute --remove
```

## Downstream Tasks

We provide pre-trained weights of the models for different downstream tasks in the downstream_ckps folder, including real-data-trained, simulated-data-trained, and co-trained versions.
For 3D detection, copy the checkpoints from the ./downstream_ckps/3d-det folder into RADDet_Pytorch and run:

```shell
# RAD head
python validate.py --config_dir ./configs/config.json --resume_from {model.pth}
```

or

```shell
# RA head
python validate_cart.py --config_dir ./configs/config.json --resume_from {model.pth}
```

For 2D detection (RD), copy the config and checkpoint files from the ./downstream_ckps/2d-det folder (RTMDet model on mmdet) into mmdetection and run:
```shell
python tools/test.py --config configs/{config.py} --checkpoint work_dirs/{model.pth}
```

After running demo.py, the generated simulation data is saved to the ./sim_output folder (set via --output_directory). The simulation data format is identical to the RADDet dataset, so you can use it to train your downstream models.
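To sanity-check generated cubes, they can be loaded with NumPy and collapsed into a range-azimuth map for visual inspection. A minimal sketch on a synthetic stand-in, assuming the RADDet convention of complex cubes with (range, azimuth, Doppler) axes of shape 256×256×64 (verify against your own files):

```python
import numpy as np

# Synthetic stand-in for one RAD cube. With real data, load a .npy file
# from the generated ./sim_output tree instead, e.g.:
#   rad = np.load(path_to_rad_npy)
rng = np.random.default_rng(0)
rad = rng.standard_normal((256, 256, 64)) + 1j * rng.standard_normal((256, 256, 64))

power = np.abs(rad) ** 2                    # per-bin power
ra_map = 10 * np.log10(power.sum(axis=-1))  # sum over Doppler -> range-azimuth map (dB)

print(ra_map.shape)  # (256, 256)
```

The same slicing is what the figures above show as range-azimuth views of the radar cube.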
## Citation

If you find this repository helpful, please consider citing our paper:
```
@article{xiao2025simulate,
  title={Simulate Any Radar: Attribute-Controllable Radar Simulation via Waveform Parameter Embedding},
  author={Xiao, Weiqing and Huang, Hao and Zhong, Chonghao and Lin, Yujie and Wang, Nan and Chen, Xiaoxue and Chen, Zhaoxi and Zhang, Saining and Yang, Shuocheng and Merriaux, Pierre and others},
  journal={arXiv preprint arXiv:2506.03134},
  year={2025}
}
```


