Bayesian Learning-Driven Prototypical Contrastive Loss for Class-Incremental Learning
Abstract
The primary objective of continual learning methods is to learn tasks sequentially over time, sometimes from a stream of data, while mitigating the detrimental phenomenon of catastrophic forgetting. This paper proposes a method for learning an effective representation between previously learned and newly encountered class prototypes. We propose a prototypical network with a Bayesian learning-driven contrastive loss (BLCL), tailored specifically for class-incremental learning scenarios. Our contrastive loss incorporates novel classes into the latent representation by reducing intra-class distances and increasing inter-class distances. The approach dynamically adapts the balance between the cross-entropy and contrastive loss functions with a Bayesian learning technique. Experimental results on the CIFAR-10, CIFAR-100, and ImageNet100 datasets for image classification, and on a GNSS-based image dataset for interference classification, validate the efficacy of our method and showcase its superiority over existing state-of-the-art approaches.
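To make the two ingredients from the abstract concrete, the following is a minimal PyTorch sketch of (a) a prototype-based contrastive term that pulls embeddings toward their class prototype and pushes them away from the other prototypes, and (b) a learned weighting between that term and cross-entropy. The function names, the temperature, and the uncertainty-style weighting (a common stand-in for a learned balance between losses) are illustrative assumptions, not the paper's exact Bayesian formulation.

import torch
import torch.nn.functional as F

def prototypical_contrastive_loss(features, labels, prototypes, temperature=0.1):
    """Reduce intra-class distance (to the own-class prototype) and
    increase inter-class distance (to the other prototypes)."""
    features = F.normalize(features, dim=1)            # (B, D) embeddings
    prototypes = F.normalize(prototypes, dim=1)        # (C, D) class prototypes
    logits = features @ prototypes.t() / temperature   # (B, C) cosine similarities
    return F.cross_entropy(logits, labels)             # softmax over prototypes

class WeightedObjective(torch.nn.Module):
    """Combine cross-entropy and contrastive losses via learned log-variances;
    an assumed, uncertainty-based scheme, not the paper's exact one."""
    def __init__(self):
        super().__init__()
        self.log_var_ce = torch.nn.Parameter(torch.zeros(()))
        self.log_var_con = torch.nn.Parameter(torch.zeros(()))

    def forward(self, loss_ce, loss_con):
        # Each term is scaled by its learned precision; the additive
        # log-variance terms keep the variances from growing unboundedly.
        return (torch.exp(-self.log_var_ce) * loss_ce + self.log_var_ce
                + torch.exp(-self.log_var_con) * loss_con + self.log_var_con)

# Example usage with random data (shapes only, for illustration):
feats = torch.randn(8, 64)                 # batch of 8 embeddings
labels = torch.randint(0, 5, (8,))         # 5 classes seen so far
protos = torch.randn(5, 64)                # one prototype per class
class_logits = torch.randn(8, 5)           # classifier-head outputs
objective = WeightedObjective()
loss = objective(F.cross_entropy(class_logits, labels),
                 prototypical_contrastive_loss(feats, labels, protos))

In a class-incremental setting, the prototype matrix would grow as new classes arrive, while the learned weighting adapts the trade-off between the two loss terms during training.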
Cite
Text
Raichur et al. "Bayesian Learning-Driven Prototypical Contrastive Loss for Class-Incremental Learning." Transactions on Machine Learning Research, 2025.
Markdown
[Raichur et al. "Bayesian Learning-Driven Prototypical Contrastive Loss for Class-Incremental Learning." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/raichur2025tmlr-bayesian/)
BibTeX
@article{raichur2025tmlr-bayesian,
title = {{Bayesian Learning-Driven Prototypical Contrastive Loss for Class-Incremental Learning}},
author = {Raichur, Nisha L. and Heublein, Lucas and Feigl, Tobias and Rügamer, Alexander and Mutschler, Christopher and Ott, Felix},
journal = {Transactions on Machine Learning Research},
year = {2025},
url = {https://mlanthology.org/tmlr/2025/raichur2025tmlr-bayesian/}
}