
TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up

Code for the NeurIPS 2021 paper "TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up" by Yifan Jiang, Shiyu Chang, and Zhangyang Wang.

Implementation

  • Gradient checkpointing via torch.utils.checkpoint (see the sketch after this list)
  • 16-bit precision training
  • Distributed Training (Faster!)
  • IS/FID Evaluation
  • Gradient Accumulation
  • Stronger Data Augmentation
  • Self-Modulation
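
Below is a minimal PyTorch sketch (not the repository's actual training loop) of how three of these pieces combine: gradient checkpointing via torch.utils.checkpoint, 16-bit training with torch.cuda.amp, and gradient accumulation. The tiny MLP stand-ins for G and D, the latent size, batch size, loss, and accumulation factor are all illustrative placeholders; TransGAN's real generator and discriminator are pure transformers defined in this repo's model code.

import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint
from torch.cuda.amp import autocast, GradScaler

device = "cuda" if torch.cuda.is_available() else "cpu"
use_amp = device == "cuda"

# Tiny stand-in networks; the real TransGAN G and D are transformers.
G = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, 3 * 32 * 32), nn.Tanh()).to(device)
D = nn.Sequential(nn.Linear(3 * 32 * 32, 512), nn.ReLU(), nn.Linear(512, 1)).to(device)

opt_G = torch.optim.Adam(G.parameters(), lr=1e-4)
scaler = GradScaler(enabled=use_amp)   # 16-bit (mixed) precision training
accum_steps = 4                        # gradient accumulation factor (illustrative)

for step in range(100):
    # Latent batch; requires_grad keeps the checkpointed forward in the autograd graph.
    z = torch.randn(32, 128, device=device, requires_grad=True)
    with autocast(enabled=use_amp):              # run the forward pass in fp16 where safe
        fake = checkpoint(G, z)                  # re-compute G's activations during backward to save memory
        g_loss = -D(fake).mean() / accum_steps   # toy generator loss, scaled for accumulation
    scaler.scale(g_loss).backward()
    if (step + 1) % accum_steps == 0:            # optimizer step only every accum_steps mini-batches
        scaler.step(opt_G)
        scaler.update()
        opt_G.zero_grad()

Only the generator update is sketched here; the discriminator side and the remaining items (distributed training, IS/FID evaluation, data augmentation, self-modulation) are handled by the full training scripts (see exp/cifar_train.py).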

Guidance

CIFAR-10 training script

python exp/cifar_train.py

CIFAR-10 test

First download the CIFAR-10 checkpoint and put it in ./cifar_checkpoint, then run the following script.

python exp/cifar_test.py

Main Pipeline

Representative Visual Results

CIFAR-10 visual results

The README is still being updated.

Acknowledgement

The codebase builds on AutoGAN and pytorch-image-models.

Citation

If you find this repo helpful, please cite:

@article{jiang2021transgan,
  title={TransGAN: Two Transformers Can Make One Strong GAN},
  author={Jiang, Yifan and Chang, Shiyu and Wang, Zhangyang},
  journal={arXiv preprint arXiv:2102.07074},
  year={2021}
}
