Dual-Phase Continual Learning: Supervised Adaptation Meets Unsupervised Retention

Abstract

Foundation vision-language models (VLMs) excel across diverse tasks, but adapting them to new domains without forgetting prior knowledge remains a critical challenge. Continual Learning (CL) addresses this by enabling models to learn sequentially from new data while mitigating the forgetting of prior information, typically under supervised settings involving label shift. Nonetheless, abrupt distribution shifts can still cause substantial forgetting, potentially nullifying the benefits of supervised updates, especially when storing or replaying past data is infeasible. In this work, we propose leveraging unlabeled test-time data in an unsupervised manner to reinforce prior-task performance without requiring replay or stored examples. Unlike traditional Test-Time Adaptation (TTA), which primarily targets domain shift or corruption, our method improves performance on earlier tasks by exploiting representative test samples encountered during deployment. We introduce a simple teacher-student framework with gradient-based sparse parameter updates, and show that it effectively mitigates forgetting in class-incremental CL for VLMs, offering a memory-free alternative to episodic replay with strong empirical results.
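The abstract's core mechanism, a teacher-student loop with gradient-based sparse parameter updates on unlabeled test data, can be sketched as follows. This is a minimal illustration on a toy linear classifier, not the paper's implementation: the function name `sparse_tta_step`, the hard pseudo-labeling, the top-k gradient masking rule, and the EMA teacher update are all assumptions chosen to make the idea concrete.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def sparse_tta_step(W_student, W_teacher, X, k_frac=0.1, lr=0.01, ema=0.999):
    """One unsupervised adaptation step on an unlabeled test batch X.

    Hypothetical sketch: a frozen-ish teacher produces pseudo-labels,
    the student takes a sparse gradient step (only the top-k_frac of
    gradient entries by magnitude), and the teacher tracks the student
    via an exponential moving average. The paper's exact update rule,
    architecture, and sparsity criterion may differ.
    """
    # Teacher pseudo-labels (hard labels from teacher predictions).
    p_teacher = softmax(X @ W_teacher)
    pseudo = p_teacher.argmax(axis=1)

    # Student cross-entropy gradient w.r.t. its weights.
    p_student = softmax(X @ W_student)
    onehot = np.eye(W_student.shape[1])[pseudo]
    grad = X.T @ (p_student - onehot) / len(X)

    # Sparse update: zero out all but the largest-magnitude entries.
    thresh = np.quantile(np.abs(grad), 1.0 - k_frac)
    mask = np.abs(grad) >= thresh
    W_student = W_student - lr * grad * mask

    # Slow EMA teacher update for stable pseudo-labels.
    W_teacher = ema * W_teacher + (1.0 - ema) * W_student
    return W_student, W_teacher
```

Because only a small fraction of parameters move per step, the bulk of the (pre-trained) weights stays intact, which is the intuition behind using sparse updates to retain prior-task knowledge without any replay buffer.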

Cite

Text

Singh et al. "Dual-Phase Continual Learning: Supervised Adaptation Meets Unsupervised Retention." Transactions on Machine Learning Research, 2026.

Markdown

[Singh et al. "Dual-Phase Continual Learning: Supervised Adaptation Meets Unsupervised Retention." Transactions on Machine Learning Research, 2026.](https://mlanthology.org/tmlr/2026/singh2026tmlr-dualphase/)

BibTeX

@article{singh2026tmlr-dualphase,
  title     = {{Dual-Phase Continual Learning: Supervised Adaptation Meets Unsupervised Retention}},
  author    = {Singh, Vaibhav and Aljundi, Rahaf and Belilovsky, Eugene},
  journal   = {Transactions on Machine Learning Research},
  year      = {2026},
  url       = {https://mlanthology.org/tmlr/2026/singh2026tmlr-dualphase/}
}