Semi-Supervised Domain Generalization with Stochastic StyleMatch
Abstract
We study semi-supervised domain generalization (SSDG), a more realistic problem setting than existing domain generalization research. Specifically, SSDG assumes that only a few labeled examples are available from each source domain, alongside abundant unlabeled data. Our proposed approach, called StyleMatch, extends FixMatch's two-view consistency-learning paradigm in two crucial ways to address SSDG: first, stochastic modeling is applied to the classifier's weights to mitigate overfitting on the scarce labeled data; second, style augmentation is integrated as a third view into the multi-view consistency-learning framework to enhance robustness to domain shift. We establish two SSDG benchmarks, on which StyleMatch outperforms strong baseline methods developed in related areas, including domain generalization and semi-supervised learning.
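To make the stochastic-classifier idea above concrete, here is a minimal sketch (not the authors' implementation; the function names, the diagonal-Gaussian parameterization, and all shapes are assumptions): instead of a fixed weight matrix, the classifier keeps a mean and a log standard deviation per weight and samples the weights via the reparameterization trick at each forward pass, which acts as a regularizer against overfitting the few labeled examples.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_logits(features, w_mu, w_log_sigma):
    """Sample classifier weights from a diagonal Gaussian and compute logits.

    w = mu + sigma * eps (reparameterization trick); averaging logits over
    several samples at test time reduces the prediction variance.
    """
    eps = rng.standard_normal(w_mu.shape)
    w = w_mu + np.exp(w_log_sigma) * eps  # one Monte Carlo weight sample
    return features @ w.T

# Toy example: 4 examples with 8-dim features, 3 classes (all sizes assumed).
feats = rng.standard_normal((4, 8))
w_mu = rng.standard_normal((3, 8)) * 0.1
w_log_sigma = np.full((3, 8), -2.0)  # small initial uncertainty
logits = stochastic_logits(feats, w_mu, w_log_sigma)
print(logits.shape)  # (4, 3)
```

In training, `w_mu` and `w_log_sigma` would both be learned; at inference one can either use `w_mu` directly or average predictions over multiple weight samples.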
Cite
Text
Zhou et al. "Semi-Supervised Domain Generalization with Stochastic StyleMatch." NeurIPS 2021 Workshops: DistShift, 2021.
Markdown
[Zhou et al. "Semi-Supervised Domain Generalization with Stochastic StyleMatch." NeurIPS 2021 Workshops: DistShift, 2021.](https://mlanthology.org/neuripsw/2021/zhou2021neuripsw-semisupervised/)
BibTeX
@inproceedings{zhou2021neuripsw-semisupervised,
title = {{Semi-Supervised Domain Generalization with Stochastic StyleMatch}},
author = {Zhou, Kaiyang and Loy, Chen Change and Liu, Ziwei},
booktitle = {NeurIPS 2021 Workshops: DistShift},
year = {2021},
url = {https://mlanthology.org/neuripsw/2021/zhou2021neuripsw-semisupervised/}
}