XB-MAML: Learning Expandable Basis Parameters for Effective Meta-Learning with Wide Task Coverage

Abstract

Meta-learning, which pursues an effective initialization model, has emerged as a promising approach to handling unseen tasks. However, a limitation becomes evident when a meta-learner tries to encompass a wide range of task distributions, e.g., learning across distinctive datasets or domains. Recently, a group of works has attempted to employ multiple model initializations to cover wide-ranging tasks, but they are limited in their ability to adaptively expand the set of initializations. We introduce XB-MAML, which learns expandable basis parameters that are linearly combined to form an effective initialization for a given task. XB-MAML observes the discrepancy between the vector space spanned by the basis parameters and the fine-tuned parameters to decide whether to expand the basis. Our method surpasses existing works on multi-domain meta-learning benchmarks and opens up new possibilities in meta-learning for obtaining diverse inductive biases that can be combined to reach effective initializations for diverse unseen tasks.
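The abstract's core mechanism can be illustrated with a short sketch: combine basis parameter vectors linearly into a task initialization, and expand the basis when fine-tuned parameters fall too far outside the subspace the basis spans. The snippet below is a minimal, hypothetical PyTorch illustration of that idea (the function names, shapes, and threshold are assumptions for exposition, not the authors' implementation).

```python
import torch

def combine_basis(bases: list, coeffs: torch.Tensor) -> torch.Tensor:
    """Form an initialization as a linear combination of basis parameters."""
    stacked = torch.stack(bases)              # (M, D): M bases, D parameters
    return coeffs @ stacked                   # (D,)

def residual_ratio(bases: list, theta_ft: torch.Tensor) -> float:
    """Normalized distance of fine-tuned parameters from the subspace
    spanned by the bases (least-squares projection residual)."""
    B = torch.stack(bases).T                  # (D, M)
    sol = torch.linalg.lstsq(B, theta_ft.unsqueeze(1)).solution
    residual = theta_ft - (B @ sol).squeeze(1)
    return (residual.norm() / theta_ft.norm()).item()

# Usage: expand the basis when the residual exceeds a threshold.
D, M = 100, 2
bases = [torch.randn(D) for _ in range(M)]
coeffs = torch.softmax(torch.randn(M), dim=0)  # task-specific mixing weights
init = combine_basis(bases, coeffs)
theta_ft = init + 0.1 * torch.randn(D)         # stand-in for fine-tuned params
if residual_ratio(bases, theta_ft) > 0.05:     # threshold value is an assumption
    bases.append(theta_ft.detach().clone())    # add a new basis parameter
```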

Cite

Text

Jae-Jun Lee and Sung Whan Yoon. "XB-MAML: Learning Expandable Basis Parameters for Effective Meta-Learning with Wide Task Coverage." Artificial Intelligence and Statistics, 2024.

Markdown

[Jae-Jun Lee and Sung Whan Yoon. "XB-MAML: Learning Expandable Basis Parameters for Effective Meta-Learning with Wide Task Coverage." Artificial Intelligence and Statistics, 2024.](https://mlanthology.org/aistats/2024/lee2024aistats-xbmaml/)

BibTeX

@inproceedings{lee2024aistats-xbmaml,
  title     = {{XB-MAML: Learning Expandable Basis Parameters for Effective Meta-Learning with Wide Task Coverage}},
  author    = {Lee, Jae-Jun and Yoon, Sung Whan},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2024},
  pages     = {3196--3204},
  volume    = {238},
  url       = {https://mlanthology.org/aistats/2024/lee2024aistats-xbmaml/}
}