Compression and Restoration: Exploring Elasticity in Continual Test-Time Adaptation
Abstract
Test-time adaptation is a task in which a pre-trained source model is updated during inference using test data drawn from target domains whose distributions differ from the source. However, frequent updates over a long period without resetting the model cause two main problems, i.e., error accumulation and catastrophic forgetting. Although some recent methods alleviate these problems by designing new loss functions or update strategies, they remain fragile to hyperparameter choices or suffer from a storage burden. Moreover, most methods treat every target domain equally, neglecting the characteristics of each target domain and the state of the current model, which can mislead the update direction of the model. To address these issues, we first leverage the mean cosine similarity per test batch between the features output by the source and updated models to measure the change of target domains. We then characterize the elasticity of this mean cosine similarity and use it to guide the model to update and restore adaptively. Motivated by this, we propose a frustratingly simple yet efficient method called Elastic-Test-time ENTropy Minimization (E-TENT) that dynamically adjusts the mean cosine similarity based on the relationship we establish between it and the momentum coefficient. Combined with three additional minimal improvements, E-TENT exhibits significant performance gains and strong robustness on CIFAR10-C, CIFAR100-C, and ImageNet-C, as well as in various practical scenarios.
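The domain-shift signal described in the abstract can be sketched as follows. This is a minimal illustration of computing the mean cosine similarity per test batch between source-model and updated-model features; the function name, array shapes, and normalization details are assumptions for illustration, not the paper's actual code.

```python
import numpy as np

def batch_mean_cosine_similarity(source_feats, updated_feats):
    """Mean per-sample cosine similarity between feature batches.

    source_feats, updated_feats: arrays of shape (batch, feat_dim),
    holding features from the frozen source model and the adapted
    model for the same test batch (hypothetical helper).
    """
    # L2-normalize each feature vector, then take row-wise dot products.
    s = source_feats / np.linalg.norm(source_feats, axis=1, keepdims=True)
    u = updated_feats / np.linalg.norm(updated_feats, axis=1, keepdims=True)
    return float(np.mean(np.sum(s * u, axis=1)))

# Identical feature batches yield the maximum similarity of 1.0.
f = np.ones((4, 8))
print(batch_mean_cosine_similarity(f, f))  # → 1.0
```

A value near 1.0 suggests the adapted model still behaves like the source on the current batch, while a drop signals a change in the target domain; the paper uses this quantity to decide how strongly to update or restore the model.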
Cite
Text
Li et al. "Compression and Restoration: Exploring Elasticity in Continual Test-Time Adaptation." Machine Learning, 2025. doi:10.1007/s10994-025-06739-8
Markdown
[Li et al. "Compression and Restoration: Exploring Elasticity in Continual Test-Time Adaptation." Machine Learning, 2025.](https://mlanthology.org/mlj/2025/li2025mlj-compression/) doi:10.1007/s10994-025-06739-8
BibTeX
@article{li2025mlj-compression,
title = {{Compression and Restoration: Exploring Elasticity in Continual Test-Time Adaptation}},
author = {Li, Jingwei and Liu, Chengbao and Bai, Xiwei and Tan, Jie and Chu, Jiaqi and Wang, Yudong},
journal = {Machine Learning},
year = {2025},
pages = {104},
doi = {10.1007/s10994-025-06739-8},
volume = {114},
url = {https://mlanthology.org/mlj/2025/li2025mlj-compression/}
}