Task-Agnostic Continual Learning with Hybrid Probabilistic Models

Abstract

Learning new tasks continually without forgetting on a constantly changing data distribution is essential for real-world problems but extremely challenging for modern deep learning. In this work we propose HCL, a Hybrid generative-discriminative approach to Continual Learning for classification. We model the distribution of each task and each class with a normalizing flow. The flow is used to learn the data distribution, perform classification, identify task changes, and avoid forgetting, all leveraging the invertibility and exact likelihoods that are uniquely enabled by normalizing flows. We use the generative capabilities of the flow to avoid catastrophic forgetting through generative replay and a novel functional regularization technique. For task identification, we use state-of-the-art anomaly detection techniques based on measuring the typicality of the model's statistics. We demonstrate the strong performance of HCL on a range of continual learning benchmarks such as split-MNIST, split-CIFAR, and SVHN-MNIST.
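
As a rough illustration of the hybrid idea described above (not the paper's actual implementation), the Python sketch below uses diagonal Gaussians as stand-ins for the per-class normalizing flows: classification picks the class whose density assigns the highest exact log-likelihood, and a task change is flagged when a batch's log-likelihoods become atypical relative to training-time statistics. All names here (PerClassDensity, HybridClassifier, task_changed, z_threshold) are hypothetical.

# Minimal sketch of HCL-style likelihood classification and typicality-based
# task-change detection. Diagonal Gaussians stand in for per-class flows;
# nothing below is taken from the paper's code.
import numpy as np

class PerClassDensity:
    """Stand-in for a class-conditional flow: fits a diagonal Gaussian
    and exposes exact per-example log-likelihoods, as a flow would."""
    def fit(self, x):
        self.mu = x.mean(axis=0)
        self.var = x.var(axis=0) + 1e-6
        return self

    def log_prob(self, x):
        return -0.5 * (((x - self.mu) ** 2 / self.var)
                       + np.log(2 * np.pi * self.var)).sum(axis=1)

class HybridClassifier:
    def __init__(self):
        self.densities = {}  # class label -> density model

    def fit(self, x, y):
        for c in np.unique(y):
            self.densities[c] = PerClassDensity().fit(x[y == c])
        # Record typical per-example log-likelihoods on training data;
        # used later for typicality-based task-change detection.
        lls = self.max_log_prob(x)
        self.ll_mean, self.ll_std = lls.mean(), lls.std() + 1e-6
        return self

    def max_log_prob(self, x):
        return np.max([d.log_prob(x) for d in self.densities.values()], axis=0)

    def predict(self, x):
        # Classify by the class-conditional density with highest likelihood.
        labels = list(self.densities)
        scores = np.stack([self.densities[c].log_prob(x) for c in labels])
        return np.array(labels)[scores.argmax(axis=0)]

    def task_changed(self, x_batch, z_threshold=3.0):
        """Flag a task change when the batch's mean log-likelihood is
        atypical (by z-score) relative to the training statistics."""
        z = abs(self.max_log_prob(x_batch).mean() - self.ll_mean) / self.ll_std
        return z > z_threshold

# Usage: two well-separated classes, then an out-of-distribution batch.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, (200, 2)), rng.normal(5, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)
clf = HybridClassifier().fit(x, y)
print(clf.predict(np.array([[0.1, -0.2], [5.2, 4.8]])))  # -> [0 1]
print(clf.task_changed(rng.normal(20, 1, (64, 2))))      # -> True

Real flows would replace the Gaussians with learned invertible maps whose change-of-variables formula gives the exact log-likelihoods used above, and generative replay would sample from them to rehearse old tasks.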

Cite

Text

Kirichenko et al. "Task-Agnostic Continual Learning with Hybrid Probabilistic Models." ICML 2021 Workshops: INNF, 2021.

Markdown

[Kirichenko et al. "Task-Agnostic Continual Learning with Hybrid Probabilistic Models." ICML 2021 Workshops: INNF, 2021.](https://mlanthology.org/icmlw/2021/kirichenko2021icmlw-taskagnostic/)

BibTeX

@inproceedings{kirichenko2021icmlw-taskagnostic,
  title     = {{Task-Agnostic Continual Learning with Hybrid Probabilistic Models}},
  author    = {Kirichenko, Polina and Farajtabar, Mehrdad and Rao, Dushyant and Lakshminarayanan, Balaji and Levine, Nir and Li, Ang and Hu, Huiyi and Wilson, Andrew Gordon and Pascanu, Razvan},
  booktitle = {ICML 2021 Workshops: INNF},
  year      = {2021},
  url       = {https://mlanthology.org/icmlw/2021/kirichenko2021icmlw-taskagnostic/}
}