Energy-Based Models for Continual Learning

Abstract

We motivate Energy-Based Models (EBMs) as a promising model class for continual learning problems. Instead of tackling continual learning via external memory, growing models, or regularization, EBMs natively support a dynamically growing number of tasks or classes while causing less interference with previously learned information. Our proposed version of EBMs for continual learning is simple and efficient, and it outperforms baseline methods by a large margin on several benchmarks. Moreover, our proposed contrastive-divergence-based training objective can be applied to other continual learning methods, yielding substantial boosts in their performance. We also show that EBMs adapt to a more general continual learning setting in which the data distribution changes without explicitly delineated task boundaries. These observations point to EBMs as a class of models naturally suited to the continual learning regime.
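
The abstract does not spell out the training objective, so the following is a rough illustrative sketch only: a minimal PyTorch example of one contrastive-divergence-style objective for a conditional EBM that scores (input, label) pairs. The architecture, the way features and label embeddings are combined, and the negative-sampling scheme are all assumptions for illustration, not the paper's actual design.

import torch
import torch.nn as nn
import torch.nn.functional as F

class EnergyModel(nn.Module):
    # Conditional EBM: maps an (input, label) pair to a scalar energy.
    # Lower energy means the pair is more compatible. (Illustrative only.)
    def __init__(self, num_features, num_classes, hidden=128):
        super().__init__()
        self.encode = nn.Sequential(nn.Linear(num_features, hidden), nn.ReLU())
        self.label_emb = nn.Embedding(num_classes, hidden)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x, y):
        # Combine input features with the label embedding, then map to energy.
        return self.head(self.encode(x) * self.label_emb(y)).squeeze(-1)

def contrastive_loss(model, x, y, neg_y):
    # Softmax-style contrastive objective: push down the energy of the true
    # label and push up the energies of k negative labels. Here `neg_y` is a
    # (batch, k) tensor of negative label ids, e.g. other labels present in
    # the current batch (an assumption of this sketch).
    b, k = neg_y.shape
    pos = model(x, y)                                          # (b,)
    x_rep = x.unsqueeze(1).expand(-1, k, -1).reshape(b * k, -1)
    neg = model(x_rep, neg_y.reshape(-1)).view(b, k)           # (b, k)
    logits = torch.cat([-pos.unsqueeze(1), -neg], dim=1)       # true label at index 0
    return F.cross_entropy(logits, torch.zeros(b, dtype=torch.long, device=x.device))

def predict(model, x, seen_classes):
    # Classify by picking the lowest-energy label among classes seen so far,
    # so the label space can grow as new tasks arrive.
    seen = torch.as_tensor(seen_classes, device=x.device)
    energies = torch.stack(
        [model(x, torch.full((x.size(0),), int(c), dtype=torch.long, device=x.device))
         for c in seen], dim=1)
    return seen[energies.argmin(dim=1)]

Because the objective above only touches the energies of the true label and a handful of negatives drawn from the current data, an update need not perturb energies of classes from earlier tasks, which is one plausible reading of the reduced interference the abstract describes.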

Cite

Text

Li et al. "Energy-Based Models for Continual Learning." ICLR 2021 Workshops: EBM, 2021.

Markdown

[Li et al. "Energy-Based Models for Continual Learning." ICLR 2021 Workshops: EBM, 2021.](https://mlanthology.org/iclrw/2021/li2021iclrw-energybased/)

BibTeX

@inproceedings{li2021iclrw-energybased,
  title     = {{Energy-Based Models for Continual Learning}},
  author    = {Li, Shuang and Du, Yilun and van de Ven, Gido Martijn and Mordatch, Igor},
  booktitle = {ICLR 2021 Workshops: EBM},
  year      = {2021},
  url       = {https://mlanthology.org/iclrw/2021/li2021iclrw-energybased/}
}