UniGAN: Reducing Mode Collapse in GANs Using a Uniform Generator
Abstract
Despite the significant progress made in training Generative Adversarial Networks (GANs), mode collapse, which refers to a lack of diversity in generated samples, remains a major challenge. In this paper, we propose a new type of generative diversity named uniform diversity, which relates to a newly proposed type of mode collapse named $u$-mode collapse, where the generated samples distribute nonuniformly over the data manifold. From a geometric perspective, we show that uniform diversity is closely related to the generator uniformity property, and that maximum uniform diversity is achieved if the generator is uniform. To learn a uniform generator, we propose UniGAN, a generative framework with a Normalizing Flow based generator and a simple yet sample-efficient generator uniformity regularization, which can be easily adapted to any other generative framework. We also propose a new diversity metric named udiv, which estimates the uniform diversity of a set of generated samples in practice. Experimental results verify the effectiveness of UniGAN in learning a uniform generator and improving uniform diversity.
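The uniformity property mentioned above has a standard geometric reading that the abstract leaves implicit. By the change-of-variables formula, a generator $g$ pushes the latent density $p_z$ forward to a density on the generated manifold whose volume element is governed by the Jacobian $J_g$:

$$p_g\big(g(z)\big) = \frac{p_z(z)}{\sqrt{\det\!\big(J_g(z)^\top J_g(z)\big)}},$$

so with a uniform latent prior, $g$ is uniform exactly when $\sqrt{\det\big(J_g(z)^\top J_g(z)\big)}$ is constant in $z$.

Below is a minimal PyTorch sketch of one way to turn this condition into a regularizer: penalizing the variance of the log volume element across latent samples. The function names (`log_volume_element`, `uniformity_penalty`) and this particular penalty are illustrative assumptions; the abstract does not specify the exact form of UniGAN's regularization, and the dense Jacobian computation here is only practical for toy dimensions.

```python
import torch

def log_volume_element(g, z):
    # log sqrt(det(J^T J)) at latent point z, where J is the Jacobian
    # of the generator g.  create_graph=True lets the penalty below be
    # backpropagated into g's parameters.
    J = torch.autograd.functional.jacobian(g, z, create_graph=True)
    return 0.5 * torch.logdet(J.T @ J)

def uniformity_penalty(g, latent_dim, n_samples=64):
    # Hypothetical regularizer (not necessarily the paper's): with a
    # uniform latent prior, zero variance of the log volume element
    # across latent samples means g spreads probability mass uniformly
    # over its output manifold.
    zs = torch.rand(n_samples, latent_dim)
    logs = torch.stack([log_volume_element(g, z) for z in zs])
    return logs.var()

# Toy usage: a small generator mapping a 2-D latent to 3-D outputs.
g = torch.nn.Sequential(torch.nn.Linear(2, 16), torch.nn.Tanh(),
                        torch.nn.Linear(16, 3))
penalty = uniformity_penalty(g, latent_dim=2)
penalty.backward()  # gradients flow into g's parameters
```

Penalizing the variance of the log volume element, rather than its value, avoids fixing an arbitrary global scale for the generator.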
Cite
Text
Pan et al. "UniGAN: Reducing Mode Collapse in GANs Using a Uniform Generator." Neural Information Processing Systems, 2022.
Markdown
[Pan et al. "UniGAN: Reducing Mode Collapse in GANs Using a Uniform Generator." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/pan2022neurips-unigan/)
BibTeX
@inproceedings{pan2022neurips-unigan,
  title     = {{UniGAN: Reducing Mode Collapse in GANs Using a Uniform Generator}},
  author    = {Pan, Ziqi and Niu, Li and Zhang, Liqing},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/pan2022neurips-unigan/}
}