Scalable Bayesian Learning with Posteriors
Abstract
Although theoretically compelling, Bayesian learning with modern machine learning models is computationally challenging since it requires approximating a high-dimensional posterior distribution. In this work, we (i) introduce **_posteriors_**, an easily extensible PyTorch library hosting general-purpose implementations that make Bayesian learning accessible and scalable to large data and parameter regimes; (ii) present a tempered framing of stochastic gradient Markov chain Monte Carlo, as implemented in **_posteriors_**, that transitions seamlessly into optimization and unveils a minor modification to deep ensembles to ensure they are asymptotically unbiased for the Bayesian posterior; and (iii) demonstrate and compare the utility of Bayesian approximations through experiments, including an investigation into the cold posterior effect and applications with large language models.

**_posteriors_** repository: https://github.com/normal-computing/posteriors
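To make the tempered framing concrete, the following is a minimal, hypothetical sketch of one tempered stochastic gradient Langevin dynamics (SGLD) step in plain PyTorch; it is not the **_posteriors_** API, and the names `sgld_step`, `params`, and `grads` are illustrative assumptions. With `temperature=1` the chain targets the Bayesian posterior, while `temperature=0` removes the injected noise and recovers stochastic gradient ascent on the log-posterior (i.e. MAP optimization), illustrating how tempered SGMCMC transitions seamlessly into optimization.

```python
# Hypothetical sketch of a tempered SGLD update (not the posteriors API).
import torch


def sgld_step(params, grads, lr=1e-3, temperature=1.0):
    """In-place tempered SGLD update.

    params: iterable of torch.Tensor parameters.
    grads:  matching stochastic gradients of the log-posterior.
    temperature=1 samples the Bayesian posterior asymptotically;
    temperature=0 reduces to (stochastic) gradient ascent, i.e. optimization.
    """
    with torch.no_grad():
        for p, g in zip(params, grads):
            noise = torch.randn_like(p)
            # Euler discretization of dθ = ∇log π(θ) dt + sqrt(2T) dW
            p.add_(lr * g + (2 * lr * temperature) ** 0.5 * noise)
```

The `temperature` parameter scales only the diffusion term of the underlying Langevin SDE, so a single update rule interpolates between posterior sampling and pure optimization.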
Cite
Text
Duffield et al. "Scalable Bayesian Learning with Posteriors." International Conference on Learning Representations, 2025.

Markdown

[Duffield et al. "Scalable Bayesian Learning with Posteriors." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/duffield2025iclr-scalable/)

BibTeX
@inproceedings{duffield2025iclr-scalable,
  title     = {{Scalable Bayesian Learning with Posteriors}},
  author    = {Duffield, Samuel and Donatella, Kaelan and Chiu, Johnathan and Klett, Phoebe and Simpson, Daniel},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/duffield2025iclr-scalable/}
}