Few-Shot Learning with Part Discovery and Augmentation from Unlabeled Images
Abstract
Few-shot learning is a challenging task since only a few instances are given for recognizing an unseen class. One way to alleviate this problem is to acquire a strong inductive bias via meta-learning on similar tasks. In this paper, we show that such an inductive bias can be learned from a flat collection of unlabeled images and instantiated as transferable representations shared among seen and unseen classes. Specifically, we propose a novel part-based self-supervised representation learning scheme that learns transferable representations by maximizing the similarity of an image to its discriminative part. To mitigate the overfitting caused by data scarcity in few-shot classification, we further propose a part augmentation strategy that retrieves extra images from a base dataset. We conduct systematic studies on the miniImageNet and tieredImageNet benchmarks. Remarkably, our method yields impressive results, outperforming the previous best unsupervised methods by 7.74% and 9.24% under the 5-way 1-shot and 5-way 5-shot settings, respectively, which are comparable with state-of-the-art supervised methods.
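The core objective the abstract describes, maximizing the similarity of an image to its discriminative part, can be sketched as a contrastive loss between whole-image and part embeddings. The PyTorch snippet below is a minimal illustration under stated assumptions: the toy Encoder, the random-crop stand-in for part discovery, and the InfoNCE-style part_contrastive_loss are all hypothetical, not the authors' exact architecture or loss.

# Minimal sketch (assumptions throughout): pull each image embedding toward
# the embedding of one of its own parts, push it away from other images' parts.
import torch
import torch.nn.functional as F
from torch import nn

class Encoder(nn.Module):
    """Toy conv encoder standing in for the paper's backbone (assumption)."""
    def __init__(self, dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, dim),
        )

    def forward(self, x):
        return F.normalize(self.net(x), dim=1)  # unit-length embeddings

def part_contrastive_loss(img_emb, part_emb, temperature=0.1):
    """InfoNCE-style loss: positives are (image, own part) pairs on the diagonal."""
    logits = img_emb @ part_emb.t() / temperature  # (B, B) cosine similarities
    targets = torch.arange(img_emb.size(0))        # index of each image's own part
    return F.cross_entropy(logits, targets)

# Usage: fixed center crops stand in for discovered discriminative parts.
encoder = Encoder()
images = torch.randn(8, 3, 64, 64)
parts = images[:, :, 16:48, 16:48]                 # hypothetical part crops
parts = F.interpolate(parts, size=64)              # resize parts to image resolution
loss = part_contrastive_loss(encoder(images), encoder(parts))
loss.backward()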
Cite
Text
Chen et al. "Few-Shot Learning with Part Discovery and Augmentation from Unlabeled Images." International Joint Conference on Artificial Intelligence, 2021. doi:10.24963/IJCAI.2021/313
Markdown
[Chen et al. "Few-Shot Learning with Part Discovery and Augmentation from Unlabeled Images." International Joint Conference on Artificial Intelligence, 2021.](https://mlanthology.org/ijcai/2021/chen2021ijcai-few/) doi:10.24963/IJCAI.2021/313
BibTeX
@inproceedings{chen2021ijcai-few,
title = {{Few-Shot Learning with Part Discovery and Augmentation from Unlabeled Images}},
author = {Chen, Wentao and Si, Chenyang and Wang, Wei and Wang, Liang and Wang, Zilei and Tan, Tieniu},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2021},
pages = {2271--2277},
doi = {10.24963/IJCAI.2021/313},
url = {https://mlanthology.org/ijcai/2021/chen2021ijcai-few/}
}