A Simple Recipe to Meta-Learn Forward and Backward Transfer

Abstract

Meta-learning holds the potential to provide a general and explicit solution to interference and forgetting in continual learning. However, many popular meta-learning algorithms introduce expensive and unstable optimization procedures, together with new critical hyper-parameters and requirements that hinder their applicability. We propose a new, general, and simple meta-learning algorithm for continual learning (SiM4C) that explicitly optimizes to minimize forgetting and facilitate forward transfer. We show that our method is stable, introduces only minimal computational overhead, and can be integrated with any memory-based continual learning algorithm in only a few lines of code. SiM4C meta-learns how to continually learn effectively even on very long task sequences, largely outperforming prior meta-approaches. When naively integrated with existing memory-based algorithms, SiM4C also yields universal performance benefits and state-of-the-art results across different visual classification benchmarks without introducing new hyper-parameters.
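
To make the "few lines of code" claim concrete, below is a minimal, hypothetical sketch of the general recipe the abstract describes: take a temporary gradient step on the current task, then use the loss on a replay-memory batch evaluated after that step as a meta-signal against forgetting. This is a first-order illustration in the spirit of meta-experience-replay methods, not the paper's exact SiM4C update; the function name, the inner learning rate, and the plain gradient-sum outer update are all assumptions for the sketch.

import copy

import torch
import torch.nn.functional as F

def meta_replay_step(model, optimizer, task_batch, memory_batch, inner_lr=0.01):
    # Hypothetical first-order meta-update illustrating the abstract's recipe;
    # not the exact SiM4C rule. Assumes every parameter receives a gradient.
    x_new, y_new = task_batch
    x_mem, y_mem = memory_batch

    # Inner step: one temporary SGD step on the current task (forward transfer).
    fast = copy.deepcopy(model)
    inner_loss = F.cross_entropy(fast(x_new), y_new)
    grads = torch.autograd.grad(inner_loss, list(fast.parameters()))
    with torch.no_grad():
        for p, g in zip(fast.parameters(), grads):
            p.sub_(inner_lr * g)

    # Meta-signal: the memory loss *after* adaptation measures how much the
    # task update would disturb past tasks (backward transfer / forgetting).
    mem_loss = F.cross_entropy(fast(x_mem), y_mem)
    mem_grads = torch.autograd.grad(mem_loss, list(fast.parameters()))

    # Outer step: combine the task gradient with the post-adaptation memory
    # gradient (first-order approximation) and update the real model.
    optimizer.zero_grad()
    task_loss = F.cross_entropy(model(x_new), y_new)
    task_loss.backward()
    with torch.no_grad():
        for p, g in zip(model.parameters(), mem_grads):
            p.grad.add_(g)
    optimizer.step()
    return task_loss.item(), mem_loss.item()

Because the memory gradient is taken at the post-adaptation weights, the outer update favors directions that fit the new task while staying compatible with the replay memory, which is the forward/backward-transfer trade-off the abstract says is optimized explicitly.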

Cite

Text

Cetin et al. "A Simple Recipe to Meta-Learn Forward and Backward Transfer." International Conference on Computer Vision, 2023. doi:10.1109/ICCV51070.2023.01717

Markdown

[Cetin et al. "A Simple Recipe to Meta-Learn Forward and Backward Transfer." International Conference on Computer Vision, 2023.](https://mlanthology.org/iccv/2023/cetin2023iccv-simple/) doi:10.1109/ICCV51070.2023.01717

BibTeX

@inproceedings{cetin2023iccv-simple,
  title     = {{A Simple Recipe to Meta-Learn Forward and Backward Transfer}},
  author    = {Cetin, Edoardo and Carta, Antonio and Celiktutan, Oya},
  booktitle = {International Conference on Computer Vision},
  year      = {2023},
  pages     = {18732--18742},
  doi       = {10.1109/ICCV51070.2023.01717},
  url       = {https://mlanthology.org/iccv/2023/cetin2023iccv-simple/}
}