EVCL: Elastic Variational Continual Learning with Weight Consolidation
Abstract
Continual learning aims to enable models to learn new tasks without forgetting previously acquired knowledge. This work introduces Elastic Variational Continual Learning with Weight Consolidation (EVCL), a novel hybrid model that integrates the variational posterior approximation mechanism of Variational Continual Learning (VCL) with the regularization-based parameter-protection strategy of Elastic Weight Consolidation (EWC). By combining the strengths of both methods, EVCL effectively mitigates catastrophic forgetting and better captures the dependencies between model parameters and task-specific data. Evaluated on five discriminative tasks, EVCL consistently outperforms existing baselines in both domain-incremental and task-incremental learning scenarios for deep discriminative models.
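The combination described above admits a compact sketch: train a mean-field Gaussian posterior over the weights with the VCL objective (expected log-likelihood minus a KL term anchored to the previous task's posterior), and add EWC's Fisher-weighted quadratic penalty on the posterior means. The PyTorch-style sketch below is illustrative only; names such as BayesianLinear, evcl_loss, and the lam coefficient are hypothetical rather than taken from the authors' implementation, and the single-layer model and diagonal Fisher are simplifying assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianLinear(nn.Module):
    # Linear layer with a factorized Gaussian posterior over its weights
    # (biases omitted for brevity).
    def __init__(self, d_in, d_out):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(d_out, d_in))
        self.rho = nn.Parameter(torch.full((d_out, d_in), -3.0))  # sigma = softplus(rho)

    def sigma(self):
        return F.softplus(self.rho)

    def forward(self, x):
        # Reparameterization trick: sample weights, then apply them.
        w = self.mu + self.sigma() * torch.randn_like(self.mu)
        return x @ w.t()

def gaussian_kl(mu_q, sig_q, mu_p, sig_p):
    # KL(q || p) between factorized Gaussians, summed over all weights.
    return (torch.log(sig_p / sig_q)
            + (sig_q ** 2 + (mu_q - mu_p) ** 2) / (2 * sig_p ** 2)
            - 0.5).sum()

def evcl_loss(layer, x, y, n_data, prev_mu, prev_sigma, fisher,
              lam=100.0, n_samples=5):
    # (1) Monte Carlo estimate of the expected negative log-likelihood.
    nll = sum(F.cross_entropy(layer(x), y) for _ in range(n_samples)) / n_samples
    # (2) VCL term: KL from the current posterior to the previous task's
    #     posterior, scaled per data point for minibatch training.
    kl = gaussian_kl(layer.mu, layer.sigma(), prev_mu, prev_sigma) / n_data
    # (3) EWC term: Fisher-weighted quadratic penalty anchoring the
    #     posterior means to the previous task's solution.
    ewc = (fisher * (layer.mu - prev_mu) ** 2).sum()
    return nll + kl + lam * ewc

In this sketch, prev_mu and prev_sigma would be snapshots of the posterior taken after finishing the previous task, fisher a diagonal Fisher information estimate at those means, and lam a hyperparameter trading off plasticity against stability.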
Cite
Text
Batra and Clark. "EVCL: Elastic Variational Continual Learning with Weight Consolidation." ICML 2024 Workshops: SPIGM, 2024.
Markdown
[Batra and Clark. "EVCL: Elastic Variational Continual Learning with Weight Consolidation." ICML 2024 Workshops: SPIGM, 2024.](https://mlanthology.org/icmlw/2024/batra2024icmlw-evcl/)
BibTeX
@inproceedings{batra2024icmlw-evcl,
  title     = {{EVCL: Elastic Variational Continual Learning with Weight Consolidation}},
  author    = {Batra, Hunar and Clark, Ronald},
  booktitle = {ICML 2024 Workshops: SPIGM},
  year      = {2024},
  url       = {https://mlanthology.org/icmlw/2024/batra2024icmlw-evcl/}
}