Diffusion Models and Semi-Supervised Learners Benefit Mutually with Few Labels
Abstract
To further advance semi-supervised generation and classification, we propose a simple yet effective training strategy called *dual pseudo training* (DPT), built upon strong semi-supervised learners and diffusion models. DPT operates in three stages: training a classifier on partially labeled data to predict pseudo-labels; training a conditional generative model on these pseudo-labels to generate pseudo images; and retraining the classifier on a mix of real and pseudo images. Empirically, DPT consistently achieves state-of-the-art (SOTA) performance on semi-supervised generation and classification across various settings. In particular, with one or two labels per class, DPT achieves a Fréchet Inception Distance (FID) of 3.08 or 2.52 on ImageNet $256\times256$. In addition, DPT substantially outperforms competitive semi-supervised baselines on ImageNet classification, *achieving top-1 accuracies of 59.0 (+2.8), 69.5 (+3.0), and 74.4 (+2.0)* with one, two, or five labels per class, respectively. Notably, our results show that diffusion models can generate realistic images with only a few labels (e.g., $<0.1\%$) and that generative augmentation remains viable for semi-supervised classification. Our code is available at *https://github.com/ML-GSAI/DPT*.
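As a rough illustration of the three stages described above, the following Python sketch outlines the DPT pipeline. The helper callables `train_classifier` and `train_diffusion`, along with the `.predict`/`.sample` interfaces, are hypothetical placeholders, not the authors' API; the actual implementation is in the linked repository.

```python
from typing import Callable, Iterable, List, Tuple

# Hypothetical helper interfaces (assumptions, not the authors' API):
#   train_classifier(labeled, unlabeled) -> model with .predict(image) -> label
#   train_diffusion(labeled)             -> model with .sample(label)  -> image

def dual_pseudo_training(
    labeled: List[Tuple[object, int]],   # (image, label) pairs
    unlabeled: List[object],             # images without labels
    classes: Iterable[int],
    train_classifier: Callable,
    train_diffusion: Callable,
    samples_per_class: int,
):
    # Stage 1: train a semi-supervised classifier on the partially
    # labeled data, then pseudo-label the unlabeled images.
    classifier = train_classifier(labeled, unlabeled)
    pseudo_labeled = [(x, classifier.predict(x)) for x in unlabeled]

    # Stage 2: train a conditional diffusion model on real plus
    # pseudo-labeled data and sample pseudo images for each class.
    diffusion = train_diffusion(labeled + pseudo_labeled)
    pseudo_images = [
        (diffusion.sample(y), y)
        for y in classes
        for _ in range(samples_per_class)
    ]

    # Stage 3: retrain the classifier on the mix of real and
    # generated (pseudo) images.
    classifier = train_classifier(labeled + pseudo_images, unlabeled)
    return classifier, diffusion
```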
Cite
Text
You et al. "Diffusion Models and Semi-Supervised Learners Benefit Mutually with Few Labels." Neural Information Processing Systems, 2023.Markdown
[You et al. "Diffusion Models and Semi-Supervised Learners Benefit Mutually with Few Labels." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/you2023neurips-diffusion/)BibTeX
@inproceedings{you2023neurips-diffusion,
title = {{Diffusion Models and Semi-Supervised Learners Benefit Mutually with Few Labels}},
author = {You, Zebin and Zhong, Yong and Bao, Fan and Sun, Jiacheng and Li, Chongxuan and Zhu, Jun},
booktitle = {Neural Information Processing Systems},
year = {2023},
url = {https://mlanthology.org/neurips/2023/you2023neurips-diffusion/}
}