Evolution of Discriminator and Generator Gradients in GAN Training: From Fitting to Collapse
Abstract
Generative Adversarial Networks (GANs) are powerful generative models but often suffer from mode mixture and mode collapse. We propose a perspective that views GAN training as a two-phase progression from fitting to collapse, in which mode mixture and mode collapse are treated as interconnected phenomena. Inspired by the particle model interpretation of GANs, we leverage the discriminator gradient to analyze particle movement and the generator gradient, specifically its "steepness," to quantify the severity of mode mixture by measuring the generator's sensitivity to changes in the latent space. Using these theoretical insights into the evolution of the gradients, we design a specialized metric that integrates both gradients to detect the transition from fitting to collapse. This metric forms the basis of an early stopping algorithm, which halts training at a point that preserves sample quality and diversity. Experiments on synthetic and real-world datasets, including MNIST, Fashion-MNIST, and CIFAR-10, validate our theoretical findings and demonstrate the effectiveness of the proposed algorithm.
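The two gradient quantities the abstract refers to can be made concrete. Below is a minimal PyTorch sketch, assuming a generator G: z → x and a discriminator D: x → score; the "steepness" here is approximated by a finite-difference directional derivative of G in latent space, and the function names, the epsilon value, and the way the two quantities would be combined into a stopping rule are illustrative assumptions, not the paper's exact algorithm.

```python
import torch

@torch.no_grad()
def steepness(G, z, eps=1e-3):
    """Finite-difference proxy for the generator's sensitivity to latent
    perturbations (the abstract's 'steepness'): mean norm of the
    directional derivative of G along random unit directions."""
    u = torch.randn_like(z)
    u = u / u.norm(dim=1, keepdim=True)   # random unit directions in latent space
    dx = G(z + eps * u) - G(z)            # resulting change in generated samples
    return dx.flatten(1).norm(dim=1).mean().item() / eps

def disc_grad_norm(D, x_fake):
    """Mean gradient norm of the discriminator at generated samples; in the
    particle-model view this is the strength of the field that moves the
    generated 'particles' toward the data distribution."""
    x = x_fake.detach().requires_grad_(True)
    (g,) = torch.autograd.grad(D(x).sum(), x)
    return g.flatten(1).norm(dim=1).mean().item()
```

In a training loop one would log both quantities each epoch; a rising steepness alongside a decaying discriminator gradient would signal the fitting-to-collapse transition the abstract describes, though the paper itself should be consulted for the actual combined metric and stopping criterion.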
Cite
Text
Gao and Li. "Evolution of Discriminator and Generator Gradients in GAN Training: From Fitting to Collapse." Transactions on Machine Learning Research, 2025.

Markdown
[Gao and Li. "Evolution of Discriminator and Generator Gradients in GAN Training: From Fitting to Collapse." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/gao2025tmlr-evolution/)

BibTeX
@article{gao2025tmlr-evolution,
  title   = {{Evolution of Discriminator and Generator Gradients in GAN Training: From Fitting to Collapse}},
  author  = {Gao, Weiguo and Li, Ming},
  journal = {Transactions on Machine Learning Research},
  year    = {2025},
  url     = {https://mlanthology.org/tmlr/2025/gao2025tmlr-evolution/}
}