
Progressive Limb-Aware Virtual Try-On, ACM MM'22.

Official code for the ACM MM 2022 paper "Progressive Limb-Aware Virtual Try-On".

We propose a novel progressive limb-aware virtual try-on framework named PL-VTON. PL-VTON consists of three stages: Multi-attribute Clothing Warping (MCW), Human Parsing Estimator (HPE), and Limb-aware Texture Fusion (LTF). Together, these stages produce stable clothing deformation and preserve texture well in the final try-on result.
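The progressive three-stage flow described above can be sketched as plain Python. Note that every class, method, and field name below is a hypothetical placeholder for illustration only; it is not the actual API of this repository.

```python
# Illustrative sketch of PL-VTON's progressive three-stage pipeline.
# All names here are hypothetical placeholders, not the repository's API.

class MultiAttributeClothingWarping:
    """MCW: warps the in-shop clothing toward the target person."""
    def __call__(self, clothing, person):
        return {"warped_clothing": clothing, "person": person}

class HumanParsingEstimator:
    """HPE: estimates the post-try-on human parsing map."""
    def __call__(self, stage1):
        stage1["parsing"] = "estimated-parsing-map"
        return stage1

class LimbAwareTextureFusion:
    """LTF: fuses the warped clothing with limb textures into the result."""
    def __call__(self, stage2):
        return ("try-on-result", stage2["warped_clothing"], stage2["parsing"])

def pl_vton(clothing, person):
    # Each stage consumes the previous stage's output, which is what
    # makes the framework "progressive".
    mcw = MultiAttributeClothingWarping()
    hpe = HumanParsingEstimator()
    ltf = LimbAwareTextureFusion()
    return ltf(hpe(mcw(clothing, person)))
```

The point of the sketch is only the data flow: MCW runs first, HPE refines its output, and LTF produces the final image.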

[Paper]

[Checkpoints]

Pipeline

(Pipeline overview figure)

Environment

python 3.7

torch 1.9.0+cu111

torchvision 0.10.0+cu111
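The pinned versions above can be installed as follows. This is a sketch assuming a conda environment and the PyTorch stable-wheel archive; adjust the environment manager and CUDA variant for your setup.

```shell
# Create an isolated environment (conda assumed; venv works too)
conda create -n pl-vton python=3.7 -y
conda activate pl-vton

# CUDA 11.1 builds of the pinned torch/torchvision versions, served from
# the PyTorch historical wheel index
pip install torch==1.9.0+cu111 torchvision==0.10.0+cu111 \
    -f https://download.pytorch.org/whl/torch_stable.html
```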

Dataset

For the dataset, please refer to VITON.

Inference

  1. Download the checkpoints from here.

  2. Get the VITON dataset.

  3. Run test.py:

python test.py

Note that the results of our pretrained model are only guaranteed on the VITON dataset.

License

The use of this code is restricted to non-commercial research and educational purposes.

Our Team's Research

Citation

If you use our code or models, please cite with:

@inproceedings{han2022progressive,
  title={Progressive Limb-Aware Virtual Try-On},
  author={Han, Xiaoyu and Zhang, Shengping and Liu, Qinglin and Li, Zonglin and Wang, Chenyang},
  booktitle={Proceedings of the 30th ACM International Conference on Multimedia},
  pages={2420--2429},
  year={2022}
}
