Memory Replay with Data Compression for Continual Learning

Abstract

Continual learning needs to overcome catastrophic forgetting of the past. Memory replay of representative old training samples has been shown to be an effective solution and achieves state-of-the-art (SOTA) performance. However, existing work is mainly built on a small memory buffer containing a few original training samples, which cannot fully characterize the old data distribution. In this work, we propose memory replay with data compression to reduce the storage cost of old training samples and thus increase the number of samples that can be stored in the memory buffer. Observing that the trade-off between the quality and quantity of compressed data is highly nontrivial for the efficacy of memory replay, we propose a novel method based on determinantal point processes (DPPs) to efficiently determine an appropriate compression quality for currently-arrived training samples. In this way, using a naive data compression algorithm with a properly selected quality can largely boost recent strong baselines by saving more compressed data in a limited storage space. We extensively validate this approach across several benchmarks of class-incremental learning and in a realistic scenario of object detection for autonomous driving.
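
The core quality-versus-quantity trade-off can be illustrated with a short sketch. The Python snippet below is a minimal illustration, not the authors' released implementation: it JPEG-compresses old samples at each candidate quality so that more of them fit into a fixed byte budget, and scores each candidate with a DPP-style log-determinant diversity proxy. The names compressed_size, dpp_logdet, choose_quality, and the extract_features embedder are all hypothetical placeholders introduced here for illustration.

import io
import numpy as np
from PIL import Image

def compressed_size(img: Image.Image, quality: int) -> int:
    """Bytes occupied by `img` after JPEG compression at `quality`."""
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    return buf.getbuffer().nbytes

def dpp_logdet(features: np.ndarray, gamma: float = 1.0) -> float:
    """Log-determinant of an RBF kernel matrix over the selected set.

    Under a DPP with this kernel, a higher log-determinant means the
    set is more diverse, i.e., it covers the feature space better.
    """
    sq = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    L = np.exp(-gamma * sq) + 1e-6 * np.eye(len(features))  # jitter for stability
    return np.linalg.slogdet(L)[1]

def choose_quality(images, extract_features, byte_budget: int,
                   candidates=(90, 75, 50, 25)) -> int:
    """Pick the JPEG quality whose affordable subset is most diverse.

    Lower quality lets more samples fit in `byte_budget` (quantity);
    higher quality keeps each sample more faithful (quality). The
    log-determinant score is one simple proxy for resolving this
    trade-off; the paper's DPP-based criterion is more elaborate.
    """
    best_q, best_score = candidates[0], -np.inf
    for q in candidates:
        per_sample = np.mean([compressed_size(im, q) for im in images])
        n_fit = max(1, min(len(images), int(byte_budget // per_sample)))
        feats = np.asarray(extract_features(images[:n_fit]))  # hypothetical embedder
        score = dpp_logdet(feats)
        if score > best_score:
            best_q, best_score = q, score
    return best_q

In this sketch, any fixed feature extractor (e.g., a frozen backbone) could serve as extract_features; the key point, as in the paper, is that the compression quality is selected by comparing how well the differently compressed, budget-feasible subsets cover the current task's data distribution, rather than being fixed by hand.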

Cite

Text

Wang et al. "Memory Replay with Data Compression for Continual Learning." International Conference on Learning Representations, 2022.

Markdown

[Wang et al. "Memory Replay with Data Compression for Continual Learning." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/wang2022iclr-memory/)

BibTeX

@inproceedings{wang2022iclr-memory,
  title     = {{Memory Replay with Data Compression for Continual Learning}},
  author    = {Wang, Liyuan and Zhang, Xingxing and Yang, Kuo and Yu, Longhui and Li, Chongxuan and Hong, Lanqing and Zhang, Shifeng and Li, Zhenguo and Zhong, Yi and Zhu, Jun},
  booktitle = {International Conference on Learning Representations},
  year      = {2022},
  url       = {https://mlanthology.org/iclr/2022/wang2022iclr-memory/}
}