Predicting the Susceptibility of Examples to Catastrophic Forgetting
Abstract
Catastrophic forgetting – the tendency of neural networks to forget previously learned data when learning new information – remains a central challenge in continual learning. In this work, we adopt a behavioral approach, observing a connection between learning speed and forgetting: examples learned more quickly are less prone to forgetting. Focusing on replay-based continual learning, we show that the composition of the replay buffer – specifically, whether it contains quickly or slowly learned examples – has a significant effect on forgetting. Motivated by this insight, we introduce Speed-Based Sampling (SBS), a simple yet general strategy that selects replay examples based on their learning speed. SBS integrates easily into existing buffer-based methods and improves performance across a wide range of competitive continual learning benchmarks, advancing state-of-the-art results. Our findings underscore the value of accounting for the forgetting dynamics when designing continual learning algorithms.
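The abstract does not specify how learning speed is measured or which end of the speed spectrum SBS keeps in the buffer, so the following is only an illustrative sketch: it measures speed as the mean per-example accuracy over training epochs (a common proxy) and exposes the selection direction as a flag. The names learning_speed, speed_based_buffer, and prefer_fast are hypothetical and not taken from the paper.

import numpy as np

def learning_speed(correctness_history):
    # correctness_history: (num_epochs, num_examples) array of 0/1
    # entries; entry (e, i) is 1 if example i was classified
    # correctly at epoch e. Examples learned early accumulate more
    # correct epochs, so their mean over epochs is higher.
    return np.asarray(correctness_history).mean(axis=0)

def speed_based_buffer(example_ids, correctness_history, buffer_size,
                       prefer_fast=True):
    # Rank examples by learning speed and keep the top buffer_size.
    speeds = learning_speed(correctness_history)
    order = np.argsort(speeds)  # ascending: slowest-learned first
    if prefer_fast:
        order = order[::-1]     # fastest-learned first
    return [example_ids[i] for i in order[:buffer_size]]

# Usage: track per-epoch correctness while training on the current
# task, then populate the replay buffer before the next task.
history = np.array([[0, 0, 1],
                    [0, 1, 1],
                    [1, 1, 1]])  # 3 epochs, 3 examples
buffer = speed_based_buffer(["a", "b", "c"], history, buffer_size=2)
print(buffer)  # ['c', 'b'] -- the two fastest-learned examples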
Cite
Text
Hacohen and Tuytelaars. "Predicting the Susceptibility of Examples to Catastrophic Forgetting." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Hacohen and Tuytelaars. "Predicting the Susceptibility of Examples to Catastrophic Forgetting." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/hacohen2025icml-predicting/)
BibTeX
@inproceedings{hacohen2025icml-predicting,
title = {{Predicting the Susceptibility of Examples to Catastrophic Forgetting}},
author = {Hacohen, Guy and Tuytelaars, Tinne},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {21546--21569},
volume = {267},
url = {https://mlanthology.org/icml/2025/hacohen2025icml-predicting/}
}