Temperature Optimization for Bayesian Deep Learning
Abstract
The Cold Posterior Effect (CPE) is a phenomenon in Bayesian Deep Learning (BDL) in which tempering the posterior to a cold temperature often improves the predictive performance of the posterior predictive distribution (PPD). Although the term ‘CPE’ suggests that colder temperatures are inherently better, the BDL community increasingly recognizes that this is not always the case. Despite this, there remains no systematic method for finding the optimal temperature beyond grid search. In this work, we propose a data-driven approach to selecting the temperature that maximizes test log-predictive density, treating the temperature as a model parameter and estimating it directly from the data. We empirically demonstrate that our method performs comparably to grid search, at a fraction of the cost, across both regression and classification tasks. Finally, we highlight the differing perspectives on CPE between the BDL and Generalized Bayes communities: while the former primarily emphasizes the predictive performance of the PPD, the latter prioritizes the utility of the posterior under model misspecification; these distinct objectives lead to different temperature preferences.
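For readers unfamiliar with tempering, the standard construction (notation ours, not drawn from the paper) raises the likelihood to a power 1/T before normalizing, and the PPD then averages the untempered likelihood over the tempered posterior:

\pi_T(\theta \mid \mathcal{D}) \propto p(\mathcal{D} \mid \theta)^{1/T}\, p(\theta), \qquad p(y^\ast \mid x^\ast, \mathcal{D}) = \int p(y^\ast \mid x^\ast, \theta)\, \pi_T(\theta \mid \mathcal{D})\, \mathrm{d}\theta.

Temperatures T < 1 are ‘cold’ and T = 1 recovers the ordinary Bayes posterior.

To make the grid-search baseline concrete, the sketch below runs it on a toy conjugate model where the tempered posterior is available in closed form (tempering a Gaussian likelihood simply scales the noise variance to T·sigma²). The setup and all names are ours for illustration; this is the baseline the paper compares against, not the paper's data-driven estimator.

# Toy illustration of the grid-search baseline: pick the temperature T that
# maximizes held-out log-predictive density. Uses Bayesian linear regression,
# where raising the Gaussian likelihood to the power 1/T scales the noise
# variance to T * sigma^2, so the tempered posterior is available in closed form.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n, sigma2, alpha = 50, 0.25, 1.0            # data size, noise variance, prior precision
X = rng.normal(size=(n, 2))
w_true = np.array([1.0, -2.0])
y = X @ w_true + rng.normal(scale=np.sqrt(sigma2), size=n)
X_tr, y_tr, X_te, y_te = X[:40], y[:40], X[40:], y[40:]

def heldout_lpd(T):
    # Tempered posterior: precision = X'X / (T sigma^2) + alpha I.
    Lam = X_tr.T @ X_tr / (T * sigma2) + alpha * np.eye(2)
    m = np.linalg.solve(Lam, X_tr.T @ y_tr / (T * sigma2))
    # The PPD integrates the *untempered* likelihood over the tempered posterior.
    var = sigma2 + np.einsum("ij,jk,ik->i", X_te, np.linalg.inv(Lam), X_te)
    return norm.logpdf(y_te, loc=X_te @ m, scale=np.sqrt(var)).mean()

temps = np.logspace(-2, 1, 30)              # grid of candidate temperatures
T_best = temps[np.argmax([heldout_lpd(T) for T in temps])]
print(f"best temperature on this toy problem: {T_best:.3f}")

Because each grid point here is a cheap closed-form solve, the cost advantage of a one-shot data-driven estimate does not show up in this toy; with a deep network, every grid point requires a full posterior approximation, which is what makes grid search expensive.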
Cite
Text
Ng et al. "Temperature Optimization for Bayesian Deep Learning." Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, 2025.
Markdown
[Ng et al. "Temperature Optimization for Bayesian Deep Learning." Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, 2025.](https://mlanthology.org/uai/2025/ng2025uai-temperature/)
BibTeX
@inproceedings{ng2025uai-temperature,
title = {{Temperature Optimization for Bayesian Deep Learning}},
author = {Ng, Kenyon and Heide, Chris and Hodgkinson, Liam and Wei, Susan},
booktitle = {Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence},
year = {2025},
pages = {3155--3181},
volume = {286},
url = {https://mlanthology.org/uai/2025/ng2025uai-temperature/}
}