The Role of Global Labels in Few-Shot Classification and How to Infer Them

Abstract

Few-shot learning is a central problem in meta-learning, where learners must quickly adapt to new tasks given limited training data. Recently, feature pre-training has become a ubiquitous component in state-of-the-art meta-learning methods and has been shown to provide significant performance improvements. However, there is limited theoretical understanding of the connection between pre-training and meta-learning. Further, pre-training requires global labels shared across tasks, which may be unavailable in practice. In this paper, we show why exploiting pre-training is theoretically advantageous for meta-learning, and in particular the critical role of global labels. This motivates us to propose Meta Label Learning (MeLa), a novel meta-learning framework that automatically infers global labels to obtain robust few-shot models. Empirically, we demonstrate that MeLa is competitive with existing methods and provide extensive ablation experiments to highlight its key properties.

Cite

Text

Wang et al. "The Role of Global Labels in Few-Shot Classification and How to Infer Them." Neural Information Processing Systems, 2021.

Markdown

[Wang et al. "The Role of Global Labels in Few-Shot Classification and How to Infer Them." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/wang2021neurips-role/)

BibTeX

@inproceedings{wang2021neurips-role,
  title     = {{The Role of Global Labels in Few-Shot Classification and How to Infer Them}},
  author    = {Wang, Ruohan and Pontil, Massimiliano and Ciliberto, Carlo},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/wang2021neurips-role/}
}