Self-Supervised Set Representation Learning for Unsupervised Meta-Learning
Abstract
Unsupervised meta-learning (UML) essentially shares the spirit of self-supervised learning (SSL): both aim to learn models without any human supervision so that the models can be adapted to downstream tasks. Moreover, the learning objective of SSL, which pulls positive pairs closer and pushes negative pairs apart, resembles that of metric-based meta-learning, one of the most successful meta-learning approaches, which learns to minimize the distance between representations from the same class. A notable difference, however, is that metric-based meta-learning is widely interpreted as a set-level problem, since inferring discriminative class prototypes (or set representations) from a few examples is crucial for performance on downstream tasks. Motivated by this, we propose Set-SimCLR, a novel self-supervised set representation learning framework targeting the UML problem. Specifically, Set-SimCLR learns a set encoder on top of instance representations to maximize the agreement between two sets of augmented samples, each generated by applying stochastic augmentations to a given image. We theoretically analyze how the proposed set representation learning can improve generalization at meta-test time, and we empirically validate its effectiveness on multiple benchmark datasets, showing that Set-SimCLR largely outperforms both UML and instance-level self-supervised learning baselines.
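As a minimal sketch of the objective described in the abstract, the PyTorch code below contrasts set representations of two independently augmented view-sets per image with a standard NT-Xent loss. All names here (instance_encoder, set_encoder, num_views, temperature) and the mean-pooling set encoder are illustrative assumptions, not the authors' actual implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

def set_simclr_loss(instance_encoder, set_encoder, images, augment,
                    num_views=4, temperature=0.5):
    """Sketch: contrast the set representations of two augmented view-sets per image."""
    batch_size = images.size(0)
    set_reps = []
    for _ in range(2):  # two independent sets of stochastically augmented views
        # Apply the stochastic augmentation num_views times to each image.
        views = torch.stack([augment(images) for _ in range(num_views)], dim=1)
        views = views.flatten(0, 1)                      # (B * K, C, H, W)
        feats = instance_encoder(views)                  # (B * K, D)
        feats = feats.view(batch_size, num_views, -1)    # (B, K, D)
        set_reps.append(set_encoder(feats))              # (B, D): one vector per set
    z1, z2 = (F.normalize(z, dim=-1) for z in set_reps)

    # NT-Xent over the 2B set representations; the positive for each set
    # is the other augmented set built from the same image.
    z = torch.cat([z1, z2], dim=0)                       # (2B, D)
    sim = z @ z.t() / temperature
    sim.fill_diagonal_(float('-inf'))                    # exclude self-similarity
    targets = torch.arange(2 * batch_size, device=z.device)
    targets = (targets + batch_size) % (2 * batch_size)
    return F.cross_entropy(sim, targets)

class MeanPoolSet(nn.Module):
    """Simplest possible set encoder (mean pooling); the paper learns a set encoder,
    so this stands in only to make the sketch self-contained."""
    def forward(self, x):  # x: (B, K, D)
        return x.mean(dim=1)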
Cite
Text
Lee et al. "Self-Supervised Set Representation Learning for Unsupervised Meta-Learning." International Conference on Learning Representations, 2023.Markdown
[Lee et al. "Self-Supervised Set Representation Learning for Unsupervised Meta-Learning." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/lee2023iclr-selfsupervised/)BibTeX
@inproceedings{lee2023iclr-selfsupervised,
title = {{Self-Supervised Set Representation Learning for Unsupervised Meta-Learning}},
author = {Lee, Dong Bok and Lee, Seanie and Kawaguchi, Kenji and Kim, Yunji and Bang, Jihwan and Ha, Jung-Woo and Hwang, Sung Ju},
booktitle = {International Conference on Learning Representations},
year = {2023},
url = {https://mlanthology.org/iclr/2023/lee2023iclr-selfsupervised/}
}