Task Similarity Aware Meta Learning: Theory-Inspired Improvement on MAML
Abstract
Few-shot learning ability is highly desirable for machine intelligence. By meta-learning a model initialization from training tasks that enables fast adaptation to new tasks, model-agnostic meta-learning (MAML) has achieved remarkable success in a number of few-shot learning applications. However, a theoretical understanding of the learning ability of MAML remains absent, hindering the principled development of new and more advanced meta-learning methods. In this work, we address this problem by theoretically justifying the fast adaptation capability of MAML when applied to new tasks. Specifically, we prove that the learnt meta-initialization can benefit fast adaptation to new tasks with only a few steps of gradient descent. This result explicitly reveals the benefits of the unique designs in MAML. We then propose a theory-inspired task-similarity-aware MAML, which clusters tasks into multiple groups according to their estimated optimal model parameters and learns a group-specific initialization for each group. The proposed method improves upon MAML by speeding up adaptation and delivering stronger few-shot learning performance. Experimental results on few-shot classification tasks verify these advantages.
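The grouping idea in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the 2-D linear-regression tasks, the `inner_adapt` routine, the plain k-means clustering, and all parameter values are assumptions chosen for illustration only. The sketch estimates each task's optimal parameters, clusters those estimates, and uses each cluster centroid as a group-specific initialization from which a new task can adapt in a few gradient steps.

```python
import numpy as np

rng = np.random.default_rng(0)

def inner_adapt(theta, X, y, lr=0.1, steps=3):
    """A few steps of gradient descent on a linear-regression task (squared loss).
    Illustrative stand-in for MAML's inner-loop adaptation."""
    w = theta.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Synthetic training tasks whose optimal parameters fall into two clusters
# (assumed setup, not the paper's benchmark).
centers = [np.array([2.0, 2.0]), np.array([-2.0, -2.0])]
tasks = []
for c in centers:
    for _ in range(10):
        w_star = c + 0.1 * rng.normal(size=2)   # task-specific optimum
        X = rng.normal(size=(20, 2))
        tasks.append((X, X @ w_star))

# Step 1: estimate each task's optimal parameters (here: by running
# gradient descent to near-convergence from a zero initialization).
est = np.array([inner_adapt(np.zeros(2), X, y, steps=100) for X, y in tasks])

# Step 2: cluster the estimated optima into groups (plain k-means).
def kmeans(points, k=2, iters=20):
    seeds = np.linspace(0, len(points) - 1, k).astype(int)
    centroids = points[seeds].copy()
    for _ in range(iters):
        labels = ((points[:, None] - centroids[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return labels, centroids

labels, group_inits = kmeans(est)

# Step 3: each cluster centroid serves as a group-specific initialization.
# A task adapted from its own group's centroid needs only a few gradient
# steps, since the initialization is already close to the task's optimum.
X_new, y_new = tasks[0]                       # a task from the first cluster
near = group_inits[((group_inits - est[0]) ** 2).sum(1).argmin()]
w_fast = inner_adapt(near, X_new, y_new, steps=3)
loss_fast = np.mean((X_new @ w_fast - y_new) ** 2)
```

With the two well-separated clusters assumed above, the centroids land near the true cluster centers, and three inner-loop steps from the matching group initialization already drive the task loss close to zero.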
Cite
Text
Zhou et al. "Task Similarity Aware Meta Learning: Theory-Inspired Improvement on MAML." Uncertainty in Artificial Intelligence, 2021.
Markdown
[Zhou et al. "Task Similarity Aware Meta Learning: Theory-Inspired Improvement on MAML." Uncertainty in Artificial Intelligence, 2021.](https://mlanthology.org/uai/2021/zhou2021uai-task/)
BibTeX
@inproceedings{zhou2021uai-task,
title = {{Task Similarity Aware Meta Learning: Theory-Inspired Improvement on MAML}},
author = {Zhou, Pan and Zou, Yingtian and Yuan, Xiao-Tong and Feng, Jiashi and Xiong, Caiming and Hoi, Steven},
booktitle = {Uncertainty in Artificial Intelligence},
year = {2021},
pages = {23--33},
volume = {161},
url = {https://mlanthology.org/uai/2021/zhou2021uai-task/}
}