Task Agnostic Representation Consolidation: A Self-Supervised Based Continual Learning Approach
Abstract
Continual learning (CL) over non-stationary data streams remains one of the long-standing challenges in deep neural networks (DNNs) as they are prone to catastrophic forgetting. CL models can benefit from self-supervised pre-training as it enables learning more generalizable task-agnostic features. However, the effect of self-supervised pre-training diminishes as the length of task sequences increases. Furthermore, the domain shift between pre-training data distribution and the task distribution reduces the generalizability of the learned representations. To address these limitations, we propose Task Agnostic Representation Consolidation (TARC), a novel two-stage training paradigm for CL that intertwines task-agnostic and task-specific learning whereby self-supervised training is followed by supervised learning for each task. To further restrict the deviation from the learned representations in the self-supervised stage, we employ a task-agnostic auxiliary loss during the supervised stage. We show that our training paradigm can be easily added to memory- or regularization-based approaches and provides consistent performance gain across more challenging CL settings. We further show that it leads to more robust and well-calibrated models.
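The abstract's two-stage paradigm can be pictured as a per-task loop: a self-supervised stage that learns task-agnostic features, followed by a supervised stage whose loss adds a task-agnostic auxiliary term to limit drift from the stage-1 representations. The sketch below is only an illustration of that structure, not the authors' implementation: the augmentation, the self-supervised objective (a simple view-invariance loss here), the module names (`encoder`, `proj_head`, `classifier`), and the `aux_weight` coefficient are all assumptions, and in the paper the paradigm is layered on top of existing memory- or regularization-based CL methods.

```python
import torch
import torch.nn.functional as F

def two_views(x):
    # Hypothetical augmentation: two noisy views of the same batch.
    # A real pipeline would use image augmentations (crops, color jitter, etc.).
    return x + 0.1 * torch.randn_like(x), x + 0.1 * torch.randn_like(x)

def ssl_loss(z1, z2):
    # Stand-in self-supervised objective: negative cosine similarity between
    # projections of two views (the paper's actual objective may differ).
    return -F.cosine_similarity(z1, z2, dim=-1).mean()

def train_task(encoder, proj_head, classifier, loader, opt,
               ssl_epochs=5, sup_epochs=5, aux_weight=0.1):
    # Stage 1: task-agnostic self-supervised training on the current task's data.
    for _ in range(ssl_epochs):
        for x, _ in loader:
            v1, v2 = two_views(x)
            loss = ssl_loss(proj_head(encoder(v1)), proj_head(encoder(v2)))
            opt.zero_grad(); loss.backward(); opt.step()

    # Stage 2: supervised learning plus a task-agnostic auxiliary loss that
    # restricts deviation from the representations learned in stage 1.
    for _ in range(sup_epochs):
        for x, y in loader:
            ce = F.cross_entropy(classifier(encoder(x)), y)
            v1, v2 = two_views(x)
            aux = ssl_loss(proj_head(encoder(v1)), proj_head(encoder(v2)))
            loss = ce + aux_weight * aux
            opt.zero_grad(); loss.backward(); opt.step()
```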
Cite
Text
Bhat et al. "Task Agnostic Representation Consolidation: A Self-Supervised Based Continual Learning Approach." Proceedings of The 1st Conference on Lifelong Learning Agents, 2022.
Markdown
[Bhat et al. "Task Agnostic Representation Consolidation: A Self-Supervised Based Continual Learning Approach." Proceedings of The 1st Conference on Lifelong Learning Agents, 2022.](https://mlanthology.org/collas/2022/bhat2022collas-task/)
BibTeX
@inproceedings{bhat2022collas-task,
  title     = {{Task Agnostic Representation Consolidation: A Self-Supervised Based Continual Learning Approach}},
  author    = {Bhat, Prashant Shivaram and Zonooz, Bahram and Arani, Elahe},
  booktitle = {Proceedings of The 1st Conference on Lifelong Learning Agents},
  year      = {2022},
  pages     = {390--405},
  volume    = {199},
  url       = {https://mlanthology.org/collas/2022/bhat2022collas-task/}
}