Attentional Meta-Learners for Few-Shot Polythetic Classification
Abstract
Polythetic classifications, based on shared patterns of features that need neither be universal nor constant among members of a class, are common in the natural world and greatly outnumber monothetic classifications over a set of features. We show that threshold meta-learners, such as Prototypical Networks, require an embedding dimension that is exponential in the number of task-relevant features to emulate these functions. In contrast, attentional classifiers, such as Matching Networks, are polythetic by default and able to solve these problems with a linear embedding dimension. However, we find that in the presence of task-irrelevant features, inherent to meta-learning problems, attentional models are susceptible to misclassification. To address this challenge, we propose a self-attention feature-selection mechanism that adaptively dilutes non-discriminative features. We demonstrate the effectiveness of our approach in meta-learning Boolean functions, and synthetic and real-world few-shot learning tasks.
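As a minimal illustration of the abstract's central claim (a hypothetical sketch, not the paper's code), consider the XOR function, the simplest polythetic task: class membership depends on a pattern across features rather than on any single feature. In raw 2-D coordinates, the two class prototypes coincide, so a Prototypical-Networks-style nearest-mean classifier is degenerate, while a Matching-Networks-style attentional classifier, which attends over individual support points, recovers the labels.

```python
import numpy as np

# Hypothetical sketch (not from the paper): XOR as a polythetic 2-way task.
# Support set: class 0 = {(0,0), (1,1)}, class 1 = {(0,1), (1,0)}.
support = np.array([[0., 0.], [1., 1.], [0., 1.], [1., 0.]])
labels = np.array([0, 0, 1, 1])

def prototypical_predict(query):
    # Prototypical Networks: nearest class mean (here in raw feature space,
    # standing in for the embedding space). Both prototypes collapse to
    # (0.5, 0.5), so the threshold model cannot separate the classes.
    protos = np.stack([support[labels == c].mean(axis=0) for c in (0, 1)])
    return int(np.argmin(((query - protos) ** 2).sum(axis=1)))

def matching_predict(query):
    # Matching Networks: attention weights (softmax over negative squared
    # distances) across individual support points, then a weighted label vote.
    d = ((query - support) ** 2).sum(axis=1)
    a = np.exp(-d) / np.exp(-d).sum()
    scores = np.array([a[labels == c].sum() for c in (0, 1)])
    return int(np.argmax(scores))
```

Querying the support points themselves, `matching_predict` returns the correct XOR label for all four, whereas the two prototypes are identical and the nearest-mean rule degenerates to a tie.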
Cite
Text
Day et al. "Attentional Meta-Learners for Few-Shot Polythetic Classification." International Conference on Machine Learning, 2022.
Markdown
[Day et al. "Attentional Meta-Learners for Few-Shot Polythetic Classification." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/day2022icml-attentional/)
BibTeX
@inproceedings{day2022icml-attentional,
title = {{Attentional Meta-Learners for Few-Shot Polythetic Classification}},
author = {Day, Ben J and Torné, Ramon Viñas and Simidjievski, Nikola and Lió, Pietro},
booktitle = {International Conference on Machine Learning},
year = {2022},
pages = {4867--4889},
volume = {162},
url = {https://mlanthology.org/icml/2022/day2022icml-attentional/}
}