Overcoming Catastrophic Forgetting by Bayesian Generative Regularization
Abstract
In this paper, we propose a new method to overcome catastrophic forgetting by adding generative regularization to the Bayesian inference framework. Bayesian methods provide a general framework for continual learning. We further construct a generative regularization term for any given classification model by leveraging energy-based models and Langevin dynamics sampling to enrich the features learned in each task. By combining the discriminative and generative losses, we empirically show that the proposed method outperforms state-of-the-art methods on a variety of tasks, avoiding catastrophic forgetting in continual learning. In particular, the proposed method outperforms baseline methods by over 15% on the Fashion-MNIST dataset and 10% on the CUB dataset.
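The abstract describes the recipe only at a high level. Below is a minimal PyTorch sketch of that general idea (in the spirit of JEM-style joint energy-based training), not the paper's exact algorithm: the classifier's logits are read as an energy over inputs, negative samples are drawn with stochastic-gradient Langevin dynamics, and the resulting generative term is added to the usual cross-entropy loss. The function names, noise initialization, step sizes, and gen_weight are illustrative assumptions.

import torch
import torch.nn.functional as F

def energy(model, x):
    # Classifier-induced energy: E(x) = -logsumexp_y f(x)[y]
    return -torch.logsumexp(model(x), dim=1)

def sgld_sample(model, x_init, n_steps=20, step_size=1.0, noise_std=0.01):
    # Approximate samples from the model via Langevin dynamics on the input.
    # n_steps / step_size / noise_std are illustrative hyperparameters.
    x = x_init.clone().detach().requires_grad_(True)
    for _ in range(n_steps):
        grad = torch.autograd.grad(energy(model, x).sum(), x)[0]
        x = x - step_size * grad + noise_std * torch.randn_like(x)
        x = x.detach().requires_grad_(True)
    return x.detach()

def combined_loss(model, x, y, gen_weight=0.1):
    # Discriminative cross-entropy plus a generative regularizer that
    # contrasts the energy of real data against SGLD ("negative") samples.
    ce_loss = F.cross_entropy(model(x), y)
    x_neg = sgld_sample(model, torch.rand_like(x))  # init from noise (assumption)
    gen_loss = energy(model, x).mean() - energy(model, x_neg).mean()
    return ce_loss + gen_weight * gen_loss

In a continual-learning loop, combined_loss would be minimized per task on top of a Bayesian (e.g. posterior-regularized) parameter update; that outer loop is omitted here since the abstract does not specify it.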
Cite
Text
Chen et al. "Overcoming Catastrophic Forgetting by Bayesian Generative Regularization." International Conference on Machine Learning, 2021.

Markdown
[Chen et al. "Overcoming Catastrophic Forgetting by Bayesian Generative Regularization." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/chen2021icml-overcoming/)

BibTeX
@inproceedings{chen2021icml-overcoming,
title = {{Overcoming Catastrophic Forgetting by Bayesian Generative Regularization}},
author = {Chen, Pei-Hung and Wei, Wei and Hsieh, Cho-Jui and Dai, Bo},
booktitle = {International Conference on Machine Learning},
year = {2021},
pages = {1760--1770},
volume = {139},
url = {https://mlanthology.org/icml/2021/chen2021icml-overcoming/}
}