Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace

Abstract

Gradient-based meta-learning methods leverage gradient descent to learn the commonalities among various tasks. While previous such methods have been successful in meta-learning tasks, they resort to simple gradient descent during meta-testing. Our primary contribution is the MT-net, which enables the meta-learner to learn, in each layer’s activation space, a subspace in which the task-specific learner performs gradient descent. Additionally, a task-specific learner of an MT-net performs gradient descent with respect to a meta-learned distance metric, which warps the activation space to be more sensitive to task identity. We demonstrate that the dimension of this learned subspace reflects the complexity of the task-specific learner’s adaptation task, and that our model is less sensitive to the choice of initial learning rates than previous gradient-based meta-learning methods. Our method achieves state-of-the-art or comparable performance on few-shot classification and regression tasks.
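
To make the mechanism concrete, below is a minimal sketch (not the authors’ implementation) of one inner-loop adaptation step on an MT-net-style layer in PyTorch. A meta-learned transformation T warps the layer’s activations and stays fixed during adaptation, while a meta-learned binary row mask m restricts the task-specific gradient step on the weights W to a subspace. All names and values here (T, W, m, inner_lr, the toy data) are illustrative assumptions.

  # Sketch of MT-net-style inner-loop adaptation (illustrative, not the paper's code).
  import torch

  torch.manual_seed(0)
  d_in, d_out, inner_lr = 8, 8, 0.1

  T = torch.randn(d_out, d_out)                      # meta-learned warp/metric, frozen in the inner loop
  W = torch.randn(d_out, d_in, requires_grad=True)   # task-specific weights, adapted per task
  m = (torch.rand(d_out) > 0.5).float()              # meta-learned row mask selecting the subspace

  x = torch.randn(4, d_in)                           # a few examples from one task
  y_target = torch.randn(4, d_out)

  # Forward pass: the layer computes T(Wx), so gradients on W are taken
  # with respect to the warped activation space.
  y = x @ W.t() @ T.t()
  loss = ((y - y_target) ** 2).mean()
  loss.backward()

  # Inner-loop step: gradient descent only on the masked rows of W,
  # i.e., within the meta-learned subspace; T is not updated here.
  with torch.no_grad():
      W -= inner_lr * m.unsqueeze(1) * W.grad

In the full method, the outer (meta) loop would update T and the mask parameters across tasks; this snippet shows only the task-specific adaptation step that the abstract describes.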

Cite

Text

Lee and Choi. "Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace." International Conference on Machine Learning, 2018.

Markdown

[Lee and Choi. "Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/lee2018icml-gradientbased/)

BibTeX

@inproceedings{lee2018icml-gradientbased,
  title     = {{Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace}},
  author    = {Lee, Yoonho and Choi, Seungjin},
  booktitle = {International Conference on Machine Learning},
  year      = {2018},
  pages     = {2927--2936},
  volume    = {80},
  url       = {https://mlanthology.org/icml/2018/lee2018icml-gradientbased/}
}