Good Semi-Supervised Learning That Requires a Bad GAN
Abstract
Semi-supervised learning methods based on generative adversarial networks (GANs) have obtained strong empirical results, but it is not clear 1) how the discriminator benefits from joint training with a generator, and 2) why good semi-supervised classification performance and a good generator cannot be obtained at the same time. Theoretically, we show that given the discriminator objective, good semi-supervised learning indeed requires a bad generator, and we propose the definition of a preferred generator. Empirically, we derive a novel formulation based on our analysis that substantially improves over feature matching GANs, obtaining state-of-the-art results on multiple benchmark datasets.
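As background for the abstract, the sketch below shows the standard GAN-based semi-supervised discriminator objective that the paper analyzes and builds on (the feature matching GAN of Salimans et al., 2016): the discriminator produces K class logits, generated samples are treated as an implicit (K+1)-th "fake" class with its logit fixed to zero, and the generator minimizes a feature matching loss. This is a minimal, hedged illustration in PyTorch, not the authors' code; the function names and the 0.5 weighting of the unsupervised terms are assumptions for exposition.

```python
import torch
import torch.nn.functional as F


def ssl_discriminator_loss(logits_lab, labels, logits_unl, logits_fake):
    """K-class semi-supervised GAN discriminator loss.

    The "fake" class is implicit: its logit is fixed to 0, so with
    Z = logsumexp(logits) we have p(real | x) = e^Z / (e^Z + 1).
    """
    # Supervised term: ordinary cross-entropy on labeled examples.
    loss_lab = F.cross_entropy(logits_lab, labels)

    # Unnormalized log "real" mass for unlabeled and generated batches.
    lse_unl = torch.logsumexp(logits_unl, dim=1)
    lse_fake = torch.logsumexp(logits_fake, dim=1)

    # Unlabeled data should be classified as real (any of the K classes):
    # -log p(real | x) = softplus(Z) - Z.
    loss_unl = (F.softplus(lse_unl) - lse_unl).mean()

    # Generated data should be classified as fake:
    # -log p(fake | x) = softplus(Z).
    loss_fake = F.softplus(lse_fake).mean()

    # The 0.5 weighting of the unsupervised terms is a common choice,
    # not something specified by this abstract.
    return loss_lab + 0.5 * (loss_unl + loss_fake)


def feature_matching_loss(feat_real, feat_fake):
    """Feature matching generator loss: match the mean intermediate
    discriminator features of real and generated minibatches."""
    return (feat_real.mean(dim=0) - feat_fake.mean(dim=0)).pow(2).sum()
```

The paper's argument is that a generator trained only with such a feature matching objective is deliberately "bad" (it does not match the true data distribution), and that this is precisely what lets the discriminator above learn good decision boundaries from unlabeled data.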
Cite
Text
Dai et al. "Good Semi-Supervised Learning That Requires a Bad GAN." Neural Information Processing Systems, 2017.
Markdown
[Dai et al. "Good Semi-Supervised Learning That Requires a Bad GAN." Neural Information Processing Systems, 2017.](https://mlanthology.org/neurips/2017/dai2017neurips-good/)
BibTeX
@inproceedings{dai2017neurips-good,
title = {{Good Semi-Supervised Learning That Requires a Bad GAN}},
author = {Dai, Zihang and Yang, Zhilin and Yang, Fan and Cohen, William W. and Salakhutdinov, Ruslan},
booktitle = {Neural Information Processing Systems},
year = {2017},
pages = {6510--6520},
url = {https://mlanthology.org/neurips/2017/dai2017neurips-good/}
}