On the Subspace Structure of Gradient-Based Meta-Learning

Abstract

In this work we analyze the distribution of the post-adaptation parameters of Gradient-Based Meta-Learning (GBML) methods. Previous work has observed that, in image classification, this adaptation takes place mainly in the last layers of the network. We propose the more general notion that parameters are updated over a low-dimensional subspace whose dimensionality matches that of the task space, and we show that this holds for regression as well. Furthermore, the induced subspace structure provides a method to estimate the intrinsic dimension of the task space of common few-shot learning datasets.

Cite

Text

Tegnér et al. "On the Subspace Structure of Gradient-Based Meta-Learning." ICML 2022 Workshops: Pre-Training, 2022.

Markdown

[Tegnér et al. "On the Subspace Structure of Gradient-Based Meta-Learning." ICML 2022 Workshops: Pre-Training, 2022.](https://mlanthology.org/icmlw/2022/tegner2022icmlw-subspace/)

BibTeX

@inproceedings{tegner2022icmlw-subspace,
  title     = {{On the Subspace Structure of Gradient-Based Meta-Learning}},
  author    = {Tegnér, Gustaf and Reichlin, Alfredo and Yin, Hang and Björkman, Mårten and Jensfelt, Danica Kragic},
  booktitle = {ICML 2022 Workshops: Pre-Training},
  year      = {2022},
  url       = {https://mlanthology.org/icmlw/2022/tegner2022icmlw-subspace/}
}