Online Structured Laplace Approximations for Overcoming Catastrophic Forgetting

Abstract

We introduce the Kronecker factored online Laplace approximation for overcoming catastrophic forgetting in neural networks. The method is grounded in a Bayesian online learning framework, where we recursively approximate the posterior after every task with a Gaussian, leading to a quadratic penalty on changes to the weights. The Laplace approximation requires calculating the Hessian around a mode, which is typically intractable for modern architectures. In order to make our method scalable, we leverage recent block-diagonal Kronecker factored approximations to the curvature. Our algorithm achieves over 90% test accuracy across a sequence of 50 instantiations of the permuted MNIST dataset, substantially outperforming related methods for overcoming catastrophic forgetting.
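The quadratic penalty described in the abstract can be evaluated cheaply per layer when the curvature is Kronecker factored: for a layer with weight matrix W, posterior mode W*, and block precision A ⊗ G, the penalty vec(ΔW)ᵀ(A ⊗ G)vec(ΔW) reduces to tr(A ΔWᵀ G ΔW), so the full Kronecker product never needs to be formed. The sketch below illustrates only this identity, not the paper's full algorithm; the factor names A (activation second moments) and G (pre-activation gradient second moments) follow the usual Kronecker-factored-curvature convention and are assumptions here.

```python
import numpy as np

def kf_quadratic_penalty(W, W_star, A, G):
    """Quadratic penalty (W - W*) under a Kronecker-factored precision A ⊗ G.

    Uses the identity vec(dW)^T (A ⊗ G) vec(dW) = tr(A dW^T G dW)
    (column-stacking vec, symmetric A and G), avoiding the explicit
    Kronecker product, which would be quartic in the layer dimensions.
    """
    dW = W - W_star
    return np.trace(A @ dW.T @ G @ dW)

# Illustrative check against the explicit (and much larger) Kronecker product.
rng = np.random.default_rng(0)
m, n = 4, 3
G = (lambda X: X @ X.T)(rng.standard_normal((m, m)))  # symmetric PSD factor
A = (lambda X: X @ X.T)(rng.standard_normal((n, n)))  # symmetric PSD factor
W, W_star = rng.standard_normal((m, n)), rng.standard_normal((m, n))
dw = (W - W_star).flatten(order="F")  # column-stacking vec
explicit = dw @ np.kron(A, G) @ dw
assert np.isclose(kf_quadratic_penalty(W, W_star, A, G), explicit)
```

In an online-Laplace training loop, this term (scaled by a hyperparameter) would be added to the new task's loss for every layer, with the factors accumulated after each task.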

Cite

Text

Ritter et al. "Online Structured Laplace Approximations for Overcoming Catastrophic Forgetting." Neural Information Processing Systems, 2018.

Markdown

[Ritter et al. "Online Structured Laplace Approximations for Overcoming Catastrophic Forgetting." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/ritter2018neurips-online/)

BibTeX

@inproceedings{ritter2018neurips-online,
  title     = {{Online Structured Laplace Approximations for Overcoming Catastrophic Forgetting}},
  author    = {Ritter, Hippolyt and Botev, Aleksandar and Barber, David},
  booktitle = {Neural Information Processing Systems},
  year      = {2018},
  pages     = {3738--3748},
  url       = {https://mlanthology.org/neurips/2018/ritter2018neurips-online/}
}