Bayesian Metaplasticity from Synaptic Uncertainty

Abstract

Catastrophic forgetting remains a challenge for neural networks, especially in lifelong learning scenarios. In this study, we introduce MEtaplasticity from Synaptic Uncertainty (MESU), inspired by metaplasticity and Bayesian inference principles. MESU harnesses synaptic uncertainty to retain information over time, with its update rule closely approximating the diagonal Newton's method for synaptic updates. Through continual learning experiments on permuted MNIST tasks, we demonstrate MESU's ability to maintain learning performance across 100 tasks without the need for explicit task boundaries.
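The core idea in the abstract, that per-synapse uncertainty can gate plasticity in a diagonal-Newton-like way, can be illustrated with a minimal sketch. This is not the authors' exact MESU rule; it is a hedged toy version under the common Bayesian-synapse assumption that each weight carries a mean `mu` and standard deviation `sigma`, and that scaling the mean update by `sigma**2` (an approximate inverse curvature) makes low-uncertainty, task-critical synapses resistant to change:

```python
import numpy as np

def metaplastic_update(mu, sigma, grad_mu, lr=1.0, sigma_min=1e-6):
    """Illustrative uncertainty-gated update (hypothetical, not MESU itself).

    Each synapse holds a posterior mean `mu` and std `sigma`. Multiplying
    the gradient by sigma**2 gives a diagonal-Newton-flavored step: the
    posterior variance plays the role of an approximate inverse curvature,
    so confident (low-sigma) synapses barely move, which protects
    previously learned tasks without explicit task boundaries.
    """
    mu_new = mu - lr * sigma**2 * grad_mu
    # Shrink uncertainty slightly after observing data (toy consolidation).
    sigma_new = np.clip(sigma * 0.99, sigma_min, None)
    return mu_new, sigma_new

# A confident synapse (sigma=0.01) moves far less than an uncertain one
# (sigma=1.0) under the same gradient.
mu = np.zeros(2)
sigma = np.array([0.01, 1.0])
grad = np.ones(2)
mu_new, sigma_new = metaplastic_update(mu, sigma, grad)
```

Here the factor `0.99` and the clipping floor are arbitrary illustrative choices; the actual paper derives its updates from Bayesian inference over the synaptic posterior.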

Cite

Text

Bonnet et al. "Bayesian Metaplasticity from Synaptic Uncertainty." NeurIPS 2023 Workshops: MLNCP, 2023.

Markdown

[Bonnet et al. "Bayesian Metaplasticity from Synaptic Uncertainty." NeurIPS 2023 Workshops: MLNCP, 2023.](https://mlanthology.org/neuripsw/2023/bonnet2023neuripsw-bayesian/)

BibTeX

@inproceedings{bonnet2023neuripsw-bayesian,
  title     = {{Bayesian Metaplasticity from Synaptic Uncertainty}},
  author    = {Bonnet, Djohan and Hirtzlin, Tifenn and Januel, Tarcisius and Dalgaty, Thomas and Querlioz, Damien and Vianello, Elisa},
  booktitle = {NeurIPS 2023 Workshops: MLNCP},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/bonnet2023neuripsw-bayesian/}
}