COME: Test-Time Adaption by Conservatively Minimizing Entropy

Abstract

Machine learning models must continuously adjust themselves to novel data distributions in the open world. As the predominant principle, entropy minimization (EM) has proven to be a simple yet effective cornerstone of existing test-time adaptation (TTA) methods. Unfortunately, its fatal limitation (i.e., overconfidence) tends to result in model collapse. To address this issue, we propose to **Co**nservatively **M**inimize the **E**ntropy (`COME`), a simple drop-in replacement for traditional EM that elegantly addresses this limitation. In essence, `COME` explicitly models uncertainty by characterizing a Dirichlet prior distribution over model predictions during TTA. By doing so, `COME` naturally regularizes the model toward conservative confidence on unreliable samples. Theoretically, we provide a preliminary analysis revealing the ability of `COME` to enhance optimization stability by introducing a data-adaptive lower bound on the entropy. Empirically, our method achieves state-of-the-art performance on commonly used benchmarks, showing significant improvements in classification accuracy and uncertainty estimation under various settings including standard, lifelong, and open-world TTA: up to a 34.5% improvement in accuracy and 15.1% in false positive rate. Our code is available at: [https://github.com/BlueWhaleLab/COME](https://github.com/BlueWhaleLab/COME).
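To make the idea concrete, the sketch below illustrates the kind of "conservative" entropy objective the abstract describes: logits are mapped to non-negative evidence parameterizing a Dirichlet distribution, part of the probability mass is set aside as an explicit uncertainty mass, and the entropy is taken over this augmented opinion. This is a minimal illustration of the subjective-logic construction, not the authors' exact implementation; the `exp` evidence mapping, the function name, and all constants are assumptions.

```python
import numpy as np

def conservative_entropy(logits):
    """Entropy of a subjective opinion derived from a Dirichlet prior.

    Illustrative sketch (not the paper's code): logits -> evidence ->
    Dirichlet parameters; the belief masses over K classes plus an
    explicit uncertainty mass u = K / S form the opinion whose Shannon
    entropy is returned. Low-evidence inputs keep a large uncertainty
    mass, so their entropy cannot collapse toward zero.
    """
    logits = np.asarray(logits, dtype=float)
    K = logits.shape[-1]
    evidence = np.exp(logits)                 # evidence mapping (assumed choice)
    alpha = evidence + 1.0                    # Dirichlet concentration parameters
    S = alpha.sum(axis=-1, keepdims=True)     # Dirichlet strength
    belief = evidence / S                     # per-class belief masses
    u = K / S                                 # uncertainty mass
    opinion = np.concatenate([belief, u], axis=-1)  # K beliefs + uncertainty
    return float(-(opinion * np.log(opinion + 1e-12)).sum(axis=-1))

# A confident prediction yields lower conservative entropy than a
# maximally uncertain one, but the uncertain one retains a nonzero floor.
print(conservative_entropy([10.0, 0.0, 0.0]))  # small
print(conservative_entropy([0.0, 0.0, 0.0]))   # larger, bounded away from 0
```

Minimizing this quantity during TTA discourages the model from driving predictions to full confidence on samples whose evidence is weak, which is the conservative behavior the abstract attributes to `COME`.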

Cite

Text

Zhang et al. "COME: Test-Time Adaption by Conservatively Minimizing Entropy." International Conference on Learning Representations, 2025.

Markdown

[Zhang et al. "COME: Test-Time Adaption by Conservatively Minimizing Entropy." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/zhang2025iclr-come/)

BibTeX

@inproceedings{zhang2025iclr-come,
  title     = {{COME: Test-Time Adaption by Conservatively Minimizing Entropy}},
  author    = {Zhang, Qingyang and Bian, Yatao and Kong, Xinke and Zhao, Peilin and Zhang, Changqing},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/zhang2025iclr-come/}
}