A Nested Bi-Level Optimization Framework for Robust Few Shot Learning

Abstract

Model-Agnostic Meta-Learning (MAML), a popular gradient-based meta-learning framework, assumes that each task or instance contributes equally to the meta-learner. Hence, it fails to address the domain shift between base and novel classes in few-shot learning. In this work, we propose a novel robust meta-learning algorithm, NESTEDMAML, which learns to assign weights to training tasks or instances. We treat these weights as hyper-parameters and iteratively optimize them using a small set of validation tasks in a nested bi-level optimization approach (in contrast to the standard bi-level optimization in MAML). We then apply NESTEDMAML in the meta-training stage, which involves (1) several tasks sampled from a distribution different from the meta-test task distribution, or (2) some data samples with noisy labels. Extensive experiments on synthetic and real-world datasets demonstrate that NESTEDMAML efficiently mitigates the effects of "unwanted" tasks or instances, leading to significant improvement over state-of-the-art robust meta-learning methods.
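To make the nested bi-level structure concrete, below is a minimal, self-contained PyTorch sketch of the idea the abstract describes, not the authors' implementation. Level 1 is MAML's inner adaptation on each task's support set; level 2 is a query-set meta-loss weighted by per-task weights; level 3 (the nested level) treats those weights as hyper-parameters and updates them so that a hypothetical, differentiable meta-step performs well on a clean validation task. The one-step hypothetical update is a common learning-to-reweight style approximation and may differ from the paper's exact procedure; all names here (`make_task`, `inner_lr`, `meta_lr`, the toy regression setup) are illustrative assumptions.

```python
# Illustrative sketch of a nested bi-level (weighted-MAML) update; NOT the
# authors' code. Toy few-shot regression with some label-noise tasks.
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_task(noisy=False):
    # Hypothetical toy task: y = a*sin(x + b); optionally corrupt support labels.
    a, b = torch.rand(1) * 4 + 1, torch.rand(1) * 3
    xs, xq = torch.randn(10, 1), torch.randn(10, 1)
    ys, yq = a * torch.sin(xs + b), a * torch.sin(xq + b)
    if noisy:
        ys = ys + 2.0 * torch.randn_like(ys)
    return (xs, ys), (xq, yq)

def forward(params, x):
    # Tiny functional MLP so we can differentiate through adapted weights.
    h = torch.tanh(x @ params[0] + params[1])
    return h @ params[2] + params[3]

def adapt(params, task, inner_lr=0.01):
    # Level 1 (MAML inner loop): one gradient step on the task's support set.
    (xs, ys), _ = task
    grads = torch.autograd.grad(F.mse_loss(forward(params, xs), ys),
                                params, create_graph=True)
    return [p - inner_lr * g for p, g in zip(params, grads)]

torch.manual_seed(0)
params = [nn.Parameter(0.1 * torch.randn(1, 32)), nn.Parameter(torch.zeros(32)),
          nn.Parameter(0.1 * torch.randn(32, 1)), nn.Parameter(torch.zeros(1))]
n_tasks, meta_lr = 4, 1e-3
w = torch.zeros(n_tasks, requires_grad=True)  # per-task weight logits (hyper-parameters)
meta_opt = torch.optim.SGD(params, lr=meta_lr)
w_opt = torch.optim.SGD([w], lr=1e-2)

for step in range(500):
    tasks = [make_task(noisy=(i >= n_tasks // 2)) for i in range(n_tasks)]
    val_task = make_task()  # small clean validation task

    # Level 2 (MAML outer loop): weighted query loss over training tasks.
    weights = torch.softmax(w, dim=0)
    meta_loss = sum(weights[i] * F.mse_loss(forward(adapt(params, t), t[1][0]), t[1][1])
                    for i, t in enumerate(tasks))

    # Level 3 (nested level): take a *hypothetical* differentiable meta-step
    # theta' = theta - meta_lr * d(meta_loss)/d(theta), evaluate it on the clean
    # validation task, and backpropagate that loss to the task weights w.
    grads = torch.autograd.grad(meta_loss, params, create_graph=True)
    hypo = [p - meta_lr * g for p, g in zip(params, grads)]
    val_loss = F.mse_loss(forward(adapt(hypo, val_task), val_task[1][0]), val_task[1][1])
    w_opt.zero_grad()
    val_loss.backward(retain_graph=True)  # fills w.grad (params.grad is cleared below)
    w_opt.step()

    # Actual meta-update of theta, using the weights from before this step.
    meta_opt.zero_grad()
    meta_loss.backward()
    meta_opt.step()
```

In this toy setting the softmax weights on the label-noise tasks should be driven down over training, which is the down-weighting of "unwanted" tasks or instances that the abstract describes.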

Cite

Text

Killamsetty et al. "A Nested Bi-Level Optimization Framework for Robust Few Shot Learning." NeurIPS 2021 Workshops: MetaLearn, 2021.

Markdown

[Killamsetty et al. "A Nested Bi-Level Optimization Framework for Robust Few Shot Learning." NeurIPS 2021 Workshops: MetaLearn, 2021.](https://mlanthology.org/neuripsw/2021/killamsetty2021neuripsw-nested/)

BibTeX

@inproceedings{killamsetty2021neuripsw-nested,
  title     = {{A Nested Bi-Level Optimization Framework for Robust Few Shot Learning}},
  author    = {Killamsetty, Krishnateja and Li, Changbin and Zhao, Chen and Chen, Feng and Iyer, Rishabh K},
  booktitle = {NeurIPS 2021 Workshops: MetaLearn},
  year      = {2021},
  url       = {https://mlanthology.org/neuripsw/2021/killamsetty2021neuripsw-nested/}
}