La-MAML: Look-Ahead Meta Learning for Continual Learning

Abstract

The continual learning problem involves training models of limited capacity to perform well on an unknown number of sequentially arriving tasks. While meta-learning shows great potential for reducing interference between old and new tasks, current training procedures tend to be either slow or offline, and sensitive to many hyper-parameters. In this work, we propose Look-ahead MAML (La-MAML), a fast optimisation-based meta-learning algorithm for online continual learning, aided by a small episodic memory. By incorporating the modulation of per-parameter learning rates into our meta-learning update, our approach also allows us to draw connections to, and exploit, prior work on hypergradients and meta-descent. This provides a more flexible and efficient way to mitigate catastrophic forgetting compared to conventional prior-based methods. La-MAML achieves performance superior to other replay-based, prior-based and meta-learning-based approaches for continual learning on real-world visual classification benchmarks.
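The sketch below illustrates, under simplifying assumptions, the kind of update the abstract describes: an inner adaptation step on the current task using learnable per-parameter learning rates, a meta-loss computed on current data mixed with episodic-memory samples, and a "look-ahead" outer step that updates the learning rates before applying them (clipped to be non-negative) to the weights. All names, the tiny linear model, the single inner step, and the synthetic data are illustrative placeholders, not the authors' implementation.

```python
# Minimal single-step sketch of a La-MAML-style update (illustrative, not the official code).
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Model parameters (theta) and per-parameter learning rates (alpha), both learned.
W = torch.randn(10, 5, requires_grad=True)
b = torch.zeros(10, requires_grad=True)
alphas = [torch.full_like(W, 0.1, requires_grad=True),
          torch.full_like(b, 0.1, requires_grad=True)]
alpha_opt = torch.optim.SGD(alphas, lr=0.3)  # outer-loop optimiser for the learning rates

def forward(params, x):
    W, b = params
    return x @ W.t() + b

# Synthetic "current task" batch and "episodic memory" batch (stand-ins for real data).
x_new, y_new = torch.randn(8, 5), torch.randint(0, 10, (8,))
x_mem, y_mem = torch.randn(8, 5), torch.randint(0, 10, (8,))

# Inner loop: adapt theta on the new-task batch using the per-parameter rates alpha.
params = [W, b]
inner_loss = F.cross_entropy(forward(params, x_new), y_new)
grads = torch.autograd.grad(inner_loss, params, create_graph=True)
fast_params = [p - a * g for p, a, g in zip(params, alphas, grads)]

# Outer (meta) loss: evaluated on new data mixed with replayed memory samples.
x_meta = torch.cat([x_new, x_mem])
y_meta = torch.cat([y_new, y_mem])
meta_loss = F.cross_entropy(forward(fast_params, x_meta), y_meta)

# Gradients of the meta-loss w.r.t. the initial weights and the learning rates.
meta_grads = torch.autograd.grad(meta_loss, params + alphas)
g_theta, g_alpha = meta_grads[:2], meta_grads[2:]

# Look-ahead: update the learning rates first ...
for a, g in zip(alphas, g_alpha):
    a.grad = g
alpha_opt.step()
alpha_opt.zero_grad()

# ... then update theta with the freshly updated, non-negative (clipped) rates.
with torch.no_grad():
    for p, a, g in zip(params, alphas, g_theta):
        p -= torch.relu(a) * g
```

In this reading, clipping a learning rate at zero effectively freezes the corresponding parameter when updating it would conflict with past tasks, which is how the per-parameter rates can act as a soft, learned alternative to hand-designed prior-based regularisers.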

Cite

Text

Gupta et al. "La-MAML: Look-Ahead Meta Learning for Continual Learning." ICML 2020 Workshops: LifelongML, 2020.

Markdown

[Gupta et al. "La-MAML: Look-Ahead Meta Learning for Continual Learning." ICML 2020 Workshops: LifelongML, 2020.](https://mlanthology.org/icmlw/2020/gupta2020icmlw-lamaml/)

BibTeX

@inproceedings{gupta2020icmlw-lamaml,
  title     = {{La-MAML: Look-Ahead Meta Learning for Continual Learning}},
  author    = {Gupta, Gunshi and Yadav, Karmesh and Paull, Liam},
  booktitle = {ICML 2020 Workshops: LifelongML},
  year      = {2020},
  url       = {https://mlanthology.org/icmlw/2020/gupta2020icmlw-lamaml/}
}