Official code for the ACM MM 2022 paper "Progressive Limb-Aware Virtual Try-On".
We propose a novel progressive limb-aware virtual try-on framework named PL-VTON. PL-VTON consists of three modules: Multi-attribute Clothing Warping (MCW), Human Parsing Estimator (HPE), and Limb-aware Texture Fusion (LTF). Together they produce stable clothing deformation and preserve limb textures well in the final try-on result.
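The progressive three-stage pipeline can be sketched conceptually as follows. This is our own illustrative sketch: the function names and data flow are placeholders, not the repository's actual module APIs.

```python
# Conceptual sketch of the PL-VTON pipeline (names are illustrative
# placeholders, not the repository's actual APIs).

def mcw(cloth, person):
    """Multi-attribute Clothing Warping: deform the in-shop clothing
    to fit the target person's pose and body shape (placeholder)."""
    return {"warped_cloth": cloth, "person": person}

def hpe(state):
    """Human Parsing Estimator: predict the post-try-on human parsing
    map, which localizes limb regions (placeholder)."""
    state["parsing"] = "limb-aware parsing map"
    return state

def ltf(state):
    """Limb-aware Texture Fusion: fuse the warped clothing with the
    person image while retaining limb texture details (placeholder)."""
    return f"try-on({state['warped_cloth']}, {state['parsing']})"

def pl_vton(cloth, person):
    # The three stages run progressively: warp -> parse -> fuse.
    return ltf(hpe(mcw(cloth, person)))

print(pl_vton("shirt", "person"))
```

The key design point the sketch reflects is that each stage consumes the previous stage's output, so clothing warping, parsing estimation, and texture fusion happen progressively rather than in one shot.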
- python 3.7
- torch 1.9.0+cu111
- torchvision 0.10.0+cu111
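Since the pinned builds carry a CUDA local suffix (`+cu111`), a quick sanity check like the one below can confirm an installed version matches the requirement before running the code. This helper is our own stdlib-only sketch, not part of the repository:

```python
# Minimal sketch: compare an installed "X.Y.Z+local" version string
# against a pinned requirement, ignoring the local build suffix.

def base_version(v: str) -> tuple:
    """Strip any local suffix (e.g. '+cu111') and parse 'X.Y.Z' to ints."""
    return tuple(int(part) for part in v.split("+")[0].split("."))

def matches(installed: str, pinned: str) -> bool:
    """True if the numeric parts of the two versions agree exactly."""
    return base_version(installed) == base_version(pinned)

# In practice, pass torch.__version__ / torchvision.__version__ here:
print(matches("1.9.0+cu111", "1.9.0+cu111"))   # True
print(matches("1.10.0+cu113", "1.9.0+cu111"))  # False
```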
For the dataset, please refer to VITON.

- Download the checkpoints from here.
- Get the VITON dataset.
- Run the test script: `python test.py`

Note that the results of our pretrained model are guaranteed on the VITON dataset only.
The use of this code is restricted to non-commercial research and educational purposes.
- [ACM MM'22] PL-VTON - Progressive Limb-Aware Virtual Try-On
- [IEEE TMM'23] PL-VTONv2 - Limb-Aware Virtual Try-On Network With Progressive Clothing Warping
- [ACM MM'24] SCW-VTON - Shape-Guided Clothing Warping for Virtual Try-On
If you use our code or models, please cite:
@inproceedings{han2022progressive,
title={Progressive Limb-Aware Virtual Try-On},
author={Han, Xiaoyu and Zhang, Shengping and Liu, Qinglin and Li, Zonglin and Wang, Chenyang},
booktitle={Proceedings of the 30th ACM International Conference on Multimedia},
pages={2420--2429},
year={2022}
}