Unsupervised Meta-Learning via Few-Shot Pseudo-Supervised Contrastive Learning

Abstract

Unsupervised meta-learning aims to learn generalizable knowledge across a distribution of tasks constructed from unlabeled data. Here, the main challenge is how to construct diverse tasks for meta-learning without label information; recent works have proposed task construction strategies such as pseudo-labeling via pretrained representations or creating synthetic samples via generative models. However, such task construction strategies are fundamentally limited due to their heavy reliance on immutable pseudo-labels during meta-learning and on the quality of the representations or the generated samples. To overcome these limitations, we propose a simple yet effective unsupervised meta-learning framework, coined Pseudo-supervised Contrast (PsCo), for few-shot classification. Inspired by the recent self-supervised learning literature, PsCo utilizes a momentum network and a queue of previous batches to improve pseudo-labeling and construct diverse tasks in a progressive manner. Our extensive experiments demonstrate that PsCo outperforms existing unsupervised meta-learning methods under various in-domain and cross-domain few-shot classification benchmarks. We also validate that PsCo is easily scalable to a large-scale benchmark, while recent prior-art meta-learning schemes are not.
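
The abstract mentions a momentum network and a queue of previous batches used for pseudo-labeling. Below is a minimal, hypothetical PyTorch sketch of that general idea: an online encoder, an EMA-updated momentum encoder, and a queue of past momentum embeddings from which pseudo-supervised targets are drawn. All names, sizes, and the top-1 matching rule are illustrative assumptions made for this sketch, not the authors' actual implementation.

```python
# Hypothetical sketch only: momentum encoder + queue for pseudo-supervised
# contrastive learning, loosely following the high-level description in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PsCoSketch(nn.Module):
    def __init__(self, in_dim=784, dim=128, queue_size=4096, momentum=0.99, temperature=0.2):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 512), nn.ReLU(), nn.Linear(512, dim))
        self.momentum_encoder = nn.Sequential(nn.Linear(in_dim, 512), nn.ReLU(), nn.Linear(512, dim))
        self.momentum_encoder.load_state_dict(self.encoder.state_dict())
        for p in self.momentum_encoder.parameters():
            p.requires_grad = False
        self.m = momentum
        self.t = temperature
        # Queue of normalized embeddings from previous batches (randomly initialized).
        self.register_buffer("queue", F.normalize(torch.randn(queue_size, dim), dim=1))
        self.register_buffer("ptr", torch.zeros(1, dtype=torch.long))

    @torch.no_grad()
    def _update_momentum_encoder(self):
        # Exponential moving average of the online encoder's parameters.
        for q, k in zip(self.encoder.parameters(), self.momentum_encoder.parameters()):
            k.data.mul_(self.m).add_(q.data, alpha=1.0 - self.m)

    @torch.no_grad()
    def _enqueue(self, keys):
        # Replace the oldest queue entries (assumes queue_size % batch_size == 0).
        n = keys.shape[0]
        ptr = int(self.ptr)
        self.queue[ptr:ptr + n] = keys
        self.ptr[0] = (ptr + n) % self.queue.shape[0]

    def forward(self, x_query, x_key):
        # Online embedding of one augmented view (the "query").
        q = F.normalize(self.encoder(x_query), dim=1)
        with torch.no_grad():
            self._update_momentum_encoder()
            # Momentum embedding of the other view (the "key").
            k = F.normalize(self.momentum_encoder(x_key), dim=1)
            # Pseudo-labeling (illustrative top-1 rule): the queue entry most
            # similar to the key acts as the positive "support" target.
            support_idx = (k @ self.queue.t()).argmax(dim=1)
        # Contrastive loss: each query should match its pseudo-labeled queue entry
        # against all other queue entries.
        logits = q @ self.queue.t() / self.t
        loss = F.cross_entropy(logits, support_idx)
        self._enqueue(k)
        return loss


if __name__ == "__main__":
    model = PsCoSketch()
    view1, view2 = torch.randn(64, 784), torch.randn(64, 784)  # two augmented views
    print(model(view1, view2).item())
```

This sketch only illustrates how a momentum network and a queue can supply progressively improving pseudo-supervision; the paper's actual task construction and loss should be taken from the publication itself.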

Cite

Text

Jang et al. "Unsupervised Meta-Learning via Few-Shot Pseudo-Supervised Contrastive Learning." International Conference on Learning Representations, 2023.

Markdown

[Jang et al. "Unsupervised Meta-Learning via Few-Shot Pseudo-Supervised Contrastive Learning." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/jang2023iclr-unsupervised/)

BibTeX

@inproceedings{jang2023iclr-unsupervised,
  title     = {{Unsupervised Meta-Learning via Few-Shot Pseudo-Supervised Contrastive Learning}},
  author    = {Jang, Huiwon and Lee, Hankook and Shin, Jinwoo},
  booktitle = {International Conference on Learning Representations},
  year      = {2023},
  url       = {https://mlanthology.org/iclr/2023/jang2023iclr-unsupervised/}
}