The GAN Is Dead; Long Live the GAN! A Modern Baseline GAN

Abstract

There is a widespread claim that GANs are difficult to train, and GAN architectures in the literature are littered with empirical tricks. We provide evidence against this claim and build a modern GAN baseline in a more principled manner. First, we derive a well-behaved regularized relativistic GAN loss that addresses issues of mode dropping and non-convergence that were previously tackled via a bag of ad-hoc tricks. We analyze our loss mathematically and prove that it admits local convergence guarantees, unlike most existing relativistic losses. Second, our new loss allows us to discard all ad-hoc tricks and replace outdated backbones used in common GANs with modern architectures. Using StyleGAN2 as an example, we present a roadmap of simplification and modernization that results in a new minimalist baseline, R3GAN. Despite being simple, our approach surpasses StyleGAN2 on FFHQ, ImageNet, CIFAR, and Stacked MNIST datasets, and compares favorably against state-of-the-art GANs and diffusion models.
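As a concrete illustration of the loss the abstract describes, below is a minimal PyTorch sketch of a relativistic pairing GAN loss combined with zero-centered R1/R2 gradient penalties. This is a sketch under stated assumptions, not the authors' reference implementation: it assumes softplus as the monotone function applied to the score difference, NCHW image tensors, and a hypothetical penalty weight gamma.

import torch
import torch.nn.functional as F

def rpgan_d_loss(D, real, fake, gamma=10.0):
    # Relativistic pairing loss: the discriminator is trained to score
    # real samples above paired fake samples.
    real = real.detach().requires_grad_(True)
    fake = fake.detach().requires_grad_(True)
    d_real, d_fake = D(real), D(fake)
    loss = F.softplus(d_fake - d_real).mean()
    # Zero-centered gradient penalties: R1 on real data, R2 on fake data.
    (r1_grad,) = torch.autograd.grad(d_real.sum(), real, create_graph=True)
    (r2_grad,) = torch.autograd.grad(d_fake.sum(), fake, create_graph=True)
    r1 = r1_grad.square().sum(dim=[1, 2, 3]).mean()  # assumes NCHW images
    r2 = r2_grad.square().sum(dim=[1, 2, 3]).mean()
    return loss + (gamma / 2) * (r1 + r2)  # gamma is a hypothetical weight

def rpgan_g_loss(D, real, fake):
    # Generator objective: push fake scores above real scores.
    return F.softplus(D(real) - D(fake)).mean()

Note that the local convergence guarantee claimed in the abstract refers to the regularized combination: the relativistic loss together with both zero-centered penalties, rather than the unregularized relativistic loss on its own.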

Cite

Text

Huang et al. "The GAN Is Dead; Long Live the GAN! A Modern Baseline GAN." ICML 2024 Workshops: SPIGM, 2024.

Markdown

[Huang et al. "The GAN Is Dead; Long Live the GAN! A Modern Baseline GAN." ICML 2024 Workshops: SPIGM, 2024.](https://mlanthology.org/icmlw/2024/huang2024icmlw-gan/)

BibTeX

@inproceedings{huang2024icmlw-gan,
  title     = {{The GAN Is Dead; Long Live the GAN! A Modern Baseline GAN}},
  author    = {Huang, Nick and Gokaslan, Aaron and Kuleshov, Volodymyr and Tompkin, James},
  booktitle = {ICML 2024 Workshops: SPIGM},
  year      = {2024},
  url       = {https://mlanthology.org/icmlw/2024/huang2024icmlw-gan/}
}