Controllable Radar Simulation with Waveform Parameter Embedding

This repo contains the official code of our paper: Simulate Any Radar: Attribute-Controllable Radar Simulation via Waveform Parameter Embedding.

Authors: Weiqing Xiao*, Hao Huang*, Chonghao Zhong*, Yujie Lin, Nan Wang, Xiaoxue Chen, Zhaoxi Chen, Saining Zhang, Shuocheng Yang, Pierre Merriaux, Lei Lei, Hao Zhao†

✨ News

  • June 4, 2025: Paper released.

  • June 3, 2025: Checkpoints of ICFAR-Net and downstream models released.

  • June 1, 2025: Project page released.

πŸ“ TODO List

  • Release camera-based pipeline.

📊 Overview


(a) SA-Radar (i.e., Ctrl-RS) enables controllable and realistic radar simulation by conditioning on customizable radar attributes. It supports flexible scene editing such as attribute modification, actor removal, and novel trajectories. (b) SA-Radar improves performance on various tasks, including semantic segmentation and 2D/3D object detection. In all settings, SA-Radar’s synthetic data matches or surpasses real data, and provides consistent gains when combined with real-world datasets.


  • From left to right are the RGB image and the range-azimuth slices of the radar cube for Ground-Truth, Simulation, Attribute Modification, Actor Removal, and Novel Trajectory.


  • Comparison of detection results between the model trained on SA-Radar simulation data and the baseline on real sequences.


  • Comparison of detection results between the model trained on SA-Radar simulation data and the baseline on simulated sequences.



Environment Setup

First, create a new Conda environment and specify the Python version:

conda create -n SA_Radar python=3.11.9
conda activate SA_Radar
pip install torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 --index-url https://download.pytorch.org/whl/cu121
pip install opencv-python
pip install scikit-image
pip install tensorboard==2.12.0
pip install matplotlib
pip install tqdm
pip install timm==0.5.4
pip install numpy==1.26.4
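To confirm the pinned versions installed correctly, a small sanity-check script can query the installed distributions. This helper is not part of the repository; it is a sketch using only the standard library:

```python
# check_env.py -- hypothetical helper, not part of the repository: confirms
# that the pinned packages from the commands above resolve to the right versions.
from importlib import metadata

# Version pins taken from the pip commands above.
PINNED = {
    "torch": "2.1.0",
    "torchvision": "0.16.0",
    "tensorboard": "2.12.0",
    "timm": "0.5.4",
    "numpy": "1.26.4",
}

def check(pkg, expected=None):
    """Return a one-line status string for a single package."""
    try:
        installed = metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return f"{pkg}: MISSING"
    if expected and not installed.startswith(expected):
        return f"{pkg}: {installed} (expected {expected})"
    return f"{pkg}: {installed} OK"

if __name__ == "__main__":
    for name, version in PINNED.items():
        print(check(name, version))
```

`startswith` is used rather than equality so that local-version builds such as `2.1.0+cu121` still pass.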

Dataset Preparation

Please keep the same directory tree as in the RADDet dataset.

Download the dataset and arrange it as the following directory tree:

|-- train
    |-- RAD
        |-- part1
            |-- ******.npy
            |-- ******.npy
        |-- part2
            |-- ******.npy
            |-- ******.npy
    |-- gt
        |-- part1
            |-- ******.pickle
            |-- ******.pickle
        |-- part2
            |-- ******.pickle
            |-- ******.pickle
    |-- stereo_image
        |-- part1
            |-- ******.jpg
            |-- ******.jpg
        |-- part2
            |-- ******.jpg
            |-- ******.jpg
|-- test
    |-- RAD
        |-- part1
            |-- ***
    |-- gt
        |-- part1
            |-- ***
    |-- ***
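An arranged dataset can be sanity-checked against this tree with a short path check. This is a sketch, not part of the repository, and it assumes the test split mirrors the train split's modality folders:

```python
# verify_layout.py -- sketch (not part of the repo) for sanity-checking the
# directory tree above; assumes the test split mirrors the train split.
from pathlib import Path

# Modality folders and their file extensions, as shown in the tree above.
MODALITIES = {"RAD": ".npy", "gt": ".pickle", "stereo_image": ".jpg"}

def missing_parts(root, splits=("train", "test")):
    """Return the expected sub-folders that do not exist under root."""
    root = Path(root)
    problems = []
    for split in splits:
        for modality in MODALITIES:
            if not (root / split / modality).is_dir():
                problems.append(f"{split}/{modality}")
    return problems

def count_files(root, split, modality):
    """Count files of the expected extension across all part* folders."""
    ext = MODALITIES[modality]
    return len(list(Path(root).glob(f"{split}/{modality}/part*/*{ext}")))
```

An empty list from `missing_parts` means every expected split/modality folder exists.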

Evaluate the pre-trained models

python evaluate.py --restore_ckpt ./models/icfar-net.pth --attribute
or
python evaluate.py --restore_ckpt ./models/icfar-net_trained_by_A.pth --attribute
or
python evaluate.py --restore_ckpt ./models/icfar-net_trained_by_B.pth --attribute
or
python evaluate.py --restore_ckpt ./models/icfar-net_trained_by_C.pth --attribute
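Rather than typing the four commands by hand, they can be scripted. The sketch below (not part of the repository) only assembles the same invocations shown above:

```python
# eval_all.py -- sketch (not part of the repo) that assembles the four
# evaluation commands listed above so they can be run in one go.
import sys

# Checkpoint paths exactly as listed above.
CHECKPOINTS = [
    "./models/icfar-net.pth",
    "./models/icfar-net_trained_by_A.pth",
    "./models/icfar-net_trained_by_B.pth",
    "./models/icfar-net_trained_by_C.pth",
]

def build_command(ckpt):
    """Assemble one `python evaluate.py` invocation as an argv list."""
    return [sys.executable, "evaluate.py", "--restore_ckpt", ckpt, "--attribute"]

if __name__ == "__main__":
    for ckpt in CHECKPOINTS:
        print(" ".join(build_command(ckpt)))
        # To actually execute: subprocess.run(build_command(ckpt), check=True)
```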

Train your model

Prepare the mixed dataset:

python make_mixed_dataset_step1.py 
python make_mixed_dataset_step2.py 

Train and evaluate your model on the mixed dataset:

python train.py --logdir ./checkpoints/icfar_mixed_bs3_lr0.0002_50e --train_datasets raddet carrada raddet_by_mr --attribute --segment_mask_loss --l1_loss --sml1_loss
python evaluate.py --restore_ckpt ./checkpoints/icfar_mixed_bs3_lr0.0002_50e/icfar-net.pth --attribute

Run Radar Simulation

Run radar simulation on the RADDet train set:

python demo.py --restore_ckpt ./models/icfar-net.pth --save_numpy --version train --attribute
or 
python demo.py --restore_ckpt ./checkpoints/icfar_mixed_bs3_lr0.0002_50e/icfar-net.pth --save_numpy --version train --attribute

Run radar simulation on nuScenes v1.0-mini:

python demo_on_NuScenes.py --restore_ckpt ./models/icfar-net.pth --attribute --time_steps 100
or
python demo_on_NuScenes.py --restore_ckpt ./models/icfar-net_wo-re.pth --time_steps 100

Scene Editing

Attribute Modification

Modify the attribute_list in demo.py directly.
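The exact schema of attribute_list is defined in demo.py. Purely as an illustration of the idea (every field name below is invented, not the repo's actual schema), an attribute sweep might look like:

```python
# Hypothetical illustration only -- the real attribute schema is defined in
# demo.py; every field name below is invented for this sketch.
attribute_list = [
    # each entry = one set of radar waveform attributes to simulate with
    {"bandwidth_ghz": 0.67, "num_chirps": 255, "noise_level": 0.1},
    {"bandwidth_ghz": 1.00, "num_chirps": 128, "noise_level": 0.3},
]

for attrs in attribute_list:
    # demo.py would condition the simulator on each attribute set here
    print(attrs)
```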

Novel Trajectories
python demo.py --restore_ckpt ./models/icfar-net.pth --save_numpy --version train --attribute --angle_rotation
Actor Removal
python demo.py --restore_ckpt ./models/icfar-net.pth --save_numpy --version train --attribute --remove

Downstream Tasks

We provide pre-trained weights for models on different downstream tasks in the downstream_ckps folder, including real-data-trained, simulated-data-trained, and co-trained versions.

Running a Pre-trained Model

For 3D detection, copy the checkpoints from ./downstream_ckps/3d-det folder into RADDet_Pytorch and run:

# RAD head
python validate.py --config_dir ./configs/config.json --resume_from {model.pth}
or
# RA head
python validate_cart.py --config_dir ./configs/config.json --resume_from {model.pth}

For 2D detection (RD), copy the config and checkpoint files from the ./downstream_ckps/2d-det folder (an RTMDet model on mmdetection) into mmdetection, and run:

python tools/test.py --config configs/{config.py} --checkpoint work_dirs/{model.pth}

Training Your Downstream Model

After running demo.py, the generated simulation data is saved to the ./sim_output folder (set via --output_directory). The simulation data format is identical to the RADDet dataset, which you can use to train your downstream model.
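Because the output mirrors RADDet's layout (RAD cubes as .npy, annotations as .pickle), a simulated sample can be inspected with a few lines. This is a sketch, and `load_sample` is a hypothetical helper, not a repo function:

```python
# inspect_sample.py -- sketch (not part of the repo) for loading one simulated
# RAD cube / annotation pair from the output folder.
import pickle

import numpy as np

def load_sample(rad_path, gt_path):
    """Load a RAD cube (.npy) and its RADDet-style annotations (.pickle)."""
    rad = np.load(rad_path)            # range-azimuth-Doppler cube
    with open(gt_path, "rb") as f:
        gt = pickle.load(f)            # annotation dict
    return rad, gt
```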

🤝 Citation

If you find this repository helpful, please consider citing our paper:

@article{xiao2025simulate,
  title={Simulate Any Radar: Attribute-Controllable Radar Simulation via Waveform Parameter Embedding},
  author={Xiao, Weiqing and Huang, Hao and Zhong, Chonghao and Lin, Yujie and Wang, Nan and Chen, Xiaoxue and Chen, Zhaoxi and Zhang, Saining and Yang, Shuocheng and Merriaux, Pierre and others},
  journal={arXiv preprint arXiv:2506.03134},
  year={2025}
}

About

[CVPR 2026 Findings] The official code of the paper "Simulate Any Radar: Attribute-Controllable Radar Simulation via Waveform Parameter Embedding".
