Learning a Universal Template for Few-Shot Dataset Generalization

Abstract

Few-shot dataset generalization is a challenging variant of the well-studied few-shot classification problem where a diverse training set of several datasets is given, for the purpose of training an adaptable model that can then learn classes from \emph{new datasets} using only a few examples. To this end, we propose to utilize the diverse training set to construct a \emph{universal template}: a partial model that can define a wide array of dataset-specialized models, by plugging in appropriate components. For each new few-shot classification problem, our approach therefore only requires inferring a small number of parameters to insert into the universal template. We design a separate network that produces an initialization of those parameters for each given task, and we then fine-tune its proposed initialization via a few steps of gradient descent. Our approach is more parameter-efficient, scalable, and adaptable than previous methods, and achieves the state-of-the-art on the challenging Meta-Dataset benchmark.
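The idea in the abstract can be sketched in a few lines: a frozen, shared feature extractor (the "universal template") is specialized to each task by a small set of FiLM-style modulation parameters, which are initialized and then fine-tuned with a few gradient steps on the task's support set. The code below is a toy illustration only; the shapes, the prototype-style loss, and the neutral (gamma=1, beta=0) initialization standing in for the paper's separate initialization network are all assumptions for brevity, not the authors' actual architecture or training recipe.

```python
import numpy as np

# Hypothetical toy setup: a single frozen layer plays the "universal template".
rng = np.random.default_rng(0)
W = rng.standard_normal((16, 8)) / 4.0  # frozen template weights (shared across datasets)

def features(x, gamma, beta):
    """Frozen shared layer whose activations are modulated by task params (FiLM-style)."""
    return gamma * np.maximum(x @ W, 0.0) + beta

def task_loss(theta, x, y, n_way=2):
    """Prototype-style cross-entropy on the support set (an illustrative head)."""
    gamma, beta = theta[:8], theta[8:]
    f = features(x, gamma, beta)
    protos = np.stack([f[y == c].mean(axis=0) for c in range(n_way)])
    logits = -((f[:, None, :] - protos[None]) ** 2).sum(axis=-1)
    logp = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(y)), y].mean()

def numgrad(f, theta, eps=1e-4):
    """Finite-difference gradient; affordable because only 16 task params exist."""
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (f(theta + e) - f(theta - e)) / (2 * eps)
    return g

# A toy 2-way few-shot task (random data, 5 support examples per class).
x = rng.standard_normal((10, 16))
y = np.array([0] * 5 + [1] * 5)

# Neutral start in place of the paper's initialization network: gamma=1, beta=0.
theta = np.concatenate([np.ones(8), np.zeros(8)])
loss0 = task_loss(theta, x, y)
for _ in range(20):  # "a few steps of gradient descent", adapting only the task params
    theta -= 0.05 * numgrad(lambda t: task_loss(t, x, y), theta)
loss1 = task_loss(theta, x, y)
```

The key property this sketch tries to convey is parameter efficiency: the template's weights `W` never change at test time, so adapting to a new dataset touches only the 16 modulation parameters.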

Cite

Text

Triantafillou et al. "Learning a Universal Template for Few-Shot Dataset Generalization." International Conference on Machine Learning, 2021.

Markdown

[Triantafillou et al. "Learning a Universal Template for Few-Shot Dataset Generalization." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/triantafillou2021icml-learning/)

BibTeX

@inproceedings{triantafillou2021icml-learning,
  title     = {{Learning a Universal Template for Few-Shot Dataset Generalization}},
  author    = {Triantafillou, Eleni and Larochelle, Hugo and Zemel, Richard and Dumoulin, Vincent},
  booktitle = {International Conference on Machine Learning},
  year      = {2021},
  pages     = {10424--10433},
  volume    = {139},
  url       = {https://mlanthology.org/icml/2021/triantafillou2021icml-learning/}
}