Incremental Learning of Structured Memory via Closed-Loop Transcription

Abstract

This work proposes a minimal computational model for learning structured memories of multiple object classes in an incremental setting. Our approach is based on establishing a closed-loop transcription between the classes and a corresponding set of subspaces, known as a linear discriminative representation, in a low-dimensional feature space. Our method is simpler than existing approaches for incremental learning and more efficient in terms of model size, storage, and computation: it requires only a single, fixed-capacity autoencoding network whose feature space serves both discriminative and generative purposes. Network parameters are optimized simultaneously, without architectural manipulations, by solving a constrained minimax game between the encoding and decoding maps over a single rate-reduction-based objective. Experimental results show that our method can effectively alleviate catastrophic forgetting, achieving significantly better performance than prior generative-replay methods on MNIST, CIFAR-10, and ImageNet-50, despite requiring fewer resources.
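To give a concrete sense of the rate-reduction quantity the minimax objective is built on, below is a minimal sketch assuming the standard MCR² coding-rate definitions; the function names, the distortion value `eps`, and the toy data are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of the rate-reduction quantity (MCR^2-style definitions).
# Not the authors' implementation; names and eps are illustrative.
import numpy as np

def coding_rate(Z, eps=0.5):
    # R(Z) = 1/2 * logdet(I + d/(n*eps^2) * Z Z^T): bits needed to code the
    # n feature vectors (columns of the d x n matrix Z) up to distortion eps.
    d, n = Z.shape
    return 0.5 * np.linalg.slogdet(np.eye(d) + (d / (n * eps**2)) * Z @ Z.T)[1]

def class_coding_rate(Z, labels, eps=0.5):
    # R_c(Z, Pi): average coding rate when each class is coded separately.
    d, n = Z.shape
    total = 0.0
    for c in np.unique(labels):
        Zc = Z[:, labels == c]
        nc = Zc.shape[1]
        total += (nc / (2 * n)) * np.linalg.slogdet(
            np.eye(d) + (d / (nc * eps**2)) * Zc @ Zc.T)[1]
    return total

def rate_reduction(Z, labels, eps=0.5):
    # Delta R = R - R_c: large when the feature set as a whole is spread out
    # while each class occupies a compact, low-dimensional subspace.
    return coding_rate(Z, eps) - class_coding_rate(Z, labels, eps)

# Toy usage: 64-dim features for 3 classes of 100 samples each.
rng = np.random.default_rng(0)
labels = np.repeat(np.arange(3), 100)
Z = rng.standard_normal((64, 300))
Z /= np.linalg.norm(Z, axis=0)  # features are typically normalized to the sphere
print(rate_reduction(Z, labels))
```

In the paper's closed-loop game, the encoding and decoding maps play opposing (maximizing and minimizing) roles over a rate-reduction-based utility that compares real and transcribed features; the sketch above shows only the core quantity, not the full game.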

Cite

Text

Tong et al. "Incremental Learning of Structured Memory via Closed-Loop Transcription." International Conference on Learning Representations, 2023.

Markdown

[Tong et al. "Incremental Learning of Structured Memory via Closed-Loop Transcription." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/tong2023iclr-incremental/)

BibTeX

@inproceedings{tong2023iclr-incremental,
  title     = {{Incremental Learning of Structured Memory via Closed-Loop Transcription}},
  author    = {Tong, Shengbang and Dai, Xili and Wu, Ziyang and Li, Mingyang and Yi, Brent and Ma, Yi},
  booktitle = {International Conference on Learning Representations},
  year      = {2023},
  url       = {https://mlanthology.org/iclr/2023/tong2023iclr-incremental/}
}