Contextual Transformation Networks for Online Continual Learning

Abstract

Continual learning methods with fixed architectures rely on a single network to learn a model that performs well on all tasks. As a result, they often capture only the common features of those tasks and neglect each task's specific features. Dynamic architecture methods, on the other hand, can devote a separate network to each task, but they are too expensive to train and do not scale in practice, especially in online settings. To address this problem, we propose a novel online continual learning method named “Contextual Transformation Networks” (CTN) that efficiently models *task-specific features* while incurring negligible complexity overhead compared to other fixed architecture methods. Moreover, inspired by the Complementary Learning Systems (CLS) theory, we propose a novel dual memory design and a training objective for CTN that address catastrophic forgetting and knowledge transfer simultaneously. Our extensive experiments show that CTN is competitive with a large-scale dynamic architecture network and consistently outperforms other fixed architecture methods under the same standard backbone. Our implementation can be found at https://github.com/phquang/Contextual-Transformation-Network.
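The core idea the abstract describes, a shared fixed-size network augmented with cheap task-specific transformations of its features, can be illustrated with a minimal sketch. The sketch below is *not* the authors' implementation (see the linked repository for that); it is a generic FiLM-style scale-and-shift conditioning, in pure Python, where all names (`backbone`, `contexts`, `contextual_features`) and dimensions are hypothetical. It shows why the overhead is negligible: each task adds only two small vectors, not a whole new network.

```python
import random

random.seed(0)

FEAT_DIM = 4

# Shared "backbone": a fixed random linear map standing in for the
# common feature extractor used by all tasks (hypothetical stand-in).
W_shared = [[random.gauss(0, 1) for _ in range(FEAT_DIM)] for _ in range(FEAT_DIM)]

def backbone(x):
    # Shared features: the "common knowledge" across tasks.
    return [sum(w * xi for w, xi in zip(row, x)) for row in W_shared]

# Per-task "context": one lightweight scale/shift vector pair per task
# (FiLM-style conditioning), instead of a separate network per task.
contexts = {
    task_id: {
        "scale": [1.0 + 0.1 * task_id] * FEAT_DIM,
        "shift": [0.01 * task_id] * FEAT_DIM,
    }
    for task_id in range(3)
}

def contextual_features(x, task_id):
    # Apply a task-specific transformation on top of the shared features.
    h = backbone(x)
    ctx = contexts[task_id]
    return [s * hi + b for s, hi, b in zip(ctx["scale"], h, ctx["shift"])]

x = [0.5, -1.0, 0.25, 2.0]
feats = {t: contextual_features(x, t) for t in contexts}

# Parameter accounting: the shared backbone dominates; each extra task
# costs only 2 * FEAT_DIM parameters.
shared_params = FEAT_DIM * FEAT_DIM
per_task_params = 2 * FEAT_DIM
```

Under this (assumed) design, all tasks reuse the same backbone output, so the per-task cost grows linearly in the feature dimension rather than in the backbone size.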

Cite

Text

Pham et al. "Contextual Transformation Networks for Online Continual Learning." International Conference on Learning Representations, 2021.

Markdown

[Pham et al. "Contextual Transformation Networks for Online Continual Learning." International Conference on Learning Representations, 2021.](https://mlanthology.org/iclr/2021/pham2021iclr-contextual/)

BibTeX

@inproceedings{pham2021iclr-contextual,
  title     = {{Contextual Transformation Networks for Online Continual Learning}},
  author    = {Pham, Quang and Liu, Chenghao and Sahoo, Doyen and Hoi, Steven},
  booktitle = {International Conference on Learning Representations},
  year      = {2021},
  url       = {https://mlanthology.org/iclr/2021/pham2021iclr-contextual/}
}