Transfer Without Forgetting

Abstract

This work investigates the entanglement between Continual Learning (CL) and Transfer Learning (TL). In particular, we shed light on the widespread application of network pretraining, highlighting that it is itself subject to catastrophic forgetting. Unfortunately, this issue leads to the under-exploitation of knowledge transfer during later tasks. On this ground, we propose Transfer without Forgetting (TwF), a hybrid approach building upon a fixed pretrained sibling network, which continuously propagates the knowledge inherent in the source domain through a layer-wise loss term. Our experiments indicate that TwF steadily outperforms other CL methods across a variety of settings, averaging a 4.81% gain in Class-Incremental accuracy over multiple datasets and buffer sizes.
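To make the core mechanism concrete, below is a minimal PyTorch sketch of the general idea the abstract describes: a frozen pretrained "sibling" copy of the backbone supervises the continually trained network through a layer-wise feature loss. All names (`make_backbone`, `layerwise_features`, the chosen layers, the MSE form, and the 0.1 weight) are illustrative assumptions, not the authors' implementation; TwF's actual objective is more elaborate.

```python
import copy

import torch
import torch.nn as nn
import torch.nn.functional as F


def make_backbone():
    # Stand-in backbone; any network exposing intermediate features works.
    return nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 10),
    )


student = make_backbone()         # trained continually on new tasks
sibling = copy.deepcopy(student)  # fixed pretrained copy, never updated
for p in sibling.parameters():
    p.requires_grad_(False)
sibling.eval()


def layerwise_features(model, x, layer_ids=(1, 3)):
    # Collect activations after selected layers (here: the two ReLUs).
    feats = []
    for i, layer in enumerate(model):
        x = layer(x)
        if i in layer_ids:
            feats.append(x)
    return feats, x  # intermediate features and final logits


def transfer_loss(x):
    # Layer-wise term pulling the student's intermediate activations toward
    # those of the frozen sibling, so source-domain knowledge keeps flowing.
    s_feats, logits = layerwise_features(student, x)
    with torch.no_grad():
        t_feats, _ = layerwise_features(sibling, x)
    distill = sum(F.mse_loss(s, t) for s, t in zip(s_feats, t_feats))
    return logits, distill


# Toy usage on one batch of the current task.
x = torch.randn(8, 3, 32, 32)
y = torch.randint(0, 10, (8,))
logits, distill = transfer_loss(x)
loss = F.cross_entropy(logits, y) + 0.1 * distill  # 0.1: illustrative weight
loss.backward()
```

In the paper this layer-wise term is combined with rehearsal on a replay buffer (hence the reported results across buffer sizes); the sketch omits that machinery and keeps only the sibling-distillation idea.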

Cite

Text

Boschini et al. "Transfer Without Forgetting." Proceedings of the European Conference on Computer Vision (ECCV), 2022. doi:10.1007/978-3-031-20050-2_40

Markdown

[Boschini et al. "Transfer Without Forgetting." Proceedings of the European Conference on Computer Vision (ECCV), 2022.](https://mlanthology.org/eccv/2022/boschini2022eccv-transfer/) doi:10.1007/978-3-031-20050-2_40

BibTeX

@inproceedings{boschini2022eccv-transfer,
  title     = {{Transfer Without Forgetting}},
  author    = {Boschini, Matteo and Bonicelli, Lorenzo and Porrello, Angelo and Bellitto, Giovanni and Pennisi, Matteo and Palazzo, Simone and Spampinato, Concetto and Calderara, Simone},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  year      = {2022},
  doi       = {10.1007/978-3-031-20050-2_40},
  url       = {https://mlanthology.org/eccv/2022/boschini2022eccv-transfer/}
}