Continual Deep Learning by Functional Regularisation of Memorable Past

Abstract

Continually learning new skills is important for intelligent systems, yet standard deep learning methods suffer from catastrophic forgetting of the past. Recent works address this with weight regularisation. Functional regularisation, although computationally expensive, is expected to perform better, but rarely does so in practice. In this paper, we fix this issue with a new functional-regularisation approach that utilises a few memorable past examples crucial to avoiding forgetting. By using a Gaussian-process formulation of deep networks, our approach enables training in weight-space while identifying both the memorable past and a functional prior. Our method achieves state-of-the-art performance on standard benchmarks and opens a new direction for life-long learning where regularisation and memory-based methods are naturally combined.
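To illustrate the general idea of functional regularisation over stored past examples, here is a minimal, hypothetical sketch on a toy linear model. This is not the paper's FROMP objective (which uses a Gaussian-process formulation and a Laplace-style posterior); it only shows the core mechanism the abstract describes: keep a few "memorable" inputs from an old task, record the old function's outputs on them, and penalise the new model for changing those outputs while it fits a new task. All names and values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(w, X):
    # Toy linear "network": f(x) = x @ w.
    return X @ w

# Task 1: fit w_old to (X1, y1) by least squares.
X1 = rng.normal(size=(50, 3))
w_task1 = np.array([1.0, -2.0, 0.5])
y1 = X1 @ w_task1
w_old = np.linalg.lstsq(X1, y1, rcond=None)[0]

# "Memorable past": a few task-1 inputs and the old function's values there.
X_mem = X1[:5]
f_mem = predict(w_old, X_mem)

# Task 2 data, drawn from a different underlying function.
X2 = rng.normal(size=(50, 3))
w_task2 = np.array([0.0, 1.0, 1.0])
y2 = X2 @ w_task2

# Functionally regularised objective, solved in closed form:
#   min_w ||X2 w - y2||^2 + lam * ||X_mem w - f_mem||^2
# i.e. fit task 2 while keeping the *function values* on the memory set
# close to what the old model predicted.
lam = 100.0
A = np.vstack([X2, np.sqrt(lam) * X_mem])
b = np.concatenate([y2, np.sqrt(lam) * f_mem])
w_new = np.linalg.lstsq(A, b, rcond=None)[0]

# Compare forgetting on the memory set with and without the penalty.
drift = np.max(np.abs(predict(w_new, X_mem) - f_mem))
w_plain = np.linalg.lstsq(X2, y2, rcond=None)[0]
drift_plain = np.max(np.abs(predict(w_plain, X_mem) - f_mem))
print(drift < drift_plain)
```

The regularised solution changes far less on the remembered inputs than an unregularised refit, which is the forgetting-mitigation effect the abstract refers to; the paper's contribution is, in addition, a principled way to choose which examples are memorable and how strongly to weight them.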

Cite

Text

Pan et al. "Continual Deep Learning by Functional Regularisation of Memorable Past." ICML 2020 Workshops: LifelongML, 2020.

Markdown

[Pan et al. "Continual Deep Learning by Functional Regularisation of Memorable Past." ICML 2020 Workshops: LifelongML, 2020.](https://mlanthology.org/icmlw/2020/pan2020icmlw-continual/)

BibTeX

@inproceedings{pan2020icmlw-continual,
  title     = {{Continual Deep Learning by Functional Regularisation of Memorable Past}},
  author    = {Pan, Pingbo and Swaroop, Siddharth and Immer, Alexander and Eschenhagen, Runa and Turner, Richard and Khan, Mohammad Emtiyaz},
  booktitle = {ICML 2020 Workshops: LifelongML},
  year      = {2020},
  url       = {https://mlanthology.org/icmlw/2020/pan2020icmlw-continual/}
}