Bayesian Ensembling: Insights from Online Optimization and Empirical Bayes

Abstract

We revisit the classical problem of Bayesian ensembling and address the challenge of learning optimal combinations of Bayesian models in an online setting. To this end, we reinterpret existing approaches such as Bayesian model averaging (BMA) and Bayesian stacking through a novel empirical Bayes lens, shedding new light on the limitations and pathologies of BMA. Further motivated by insights from online optimization, we propose Online Bayesian Stacking (OBS), a method that optimizes the log-score over predictive distributions to adaptively combine Bayesian models. A key contribution of our work is establishing a novel connection between OBS and portfolio selection, bridging Bayesian ensemble learning with a rich, well-studied theoretical framework that offers efficient algorithms and extensive regret analysis. We further clarify the relationship between OBS and online BMA, showing that they optimize related but distinct cost functions. Through theoretical analysis and empirical evaluation, we identify scenarios where OBS outperforms online BMA and provide principled guidance on when practitioners should prefer one approach over the other.
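
The contrast the abstract draws between online BMA and OBS can be made concrete with a short sketch. The NumPy code below is an illustration under our own assumptions, not the paper's implementation: online BMA multiplies each model's weight by its most recent predictive density (a Bayes update of posterior model probabilities), while OBS is sketched here as exponentiated-gradient ascent on the log-score of the mixture, the same multiplicative-weights update studied in portfolio selection. The function names, the learning rate lr, and the pred_densities array of per-step predictive density evaluations are illustrative, not from the paper.

    import numpy as np

    def online_bma(pred_densities):
        # pred_densities[t, k] = p_k(y_t | y_{1:t-1}), the k-th model's
        # predictive density evaluated at the t-th observation.
        T, K = pred_densities.shape
        w = np.ones(K) / K                     # uniform prior over models
        for t in range(T):
            w = w * pred_densities[t]          # Bayes update of model weights
            w = w / w.sum()                    # renormalize
        return w

    def online_bayesian_stacking_eg(pred_densities, lr=0.1):
        # Exponentiated-gradient ascent on the cumulative log-score of the
        # mixture; a multiplicative-weights scheme from portfolio selection.
        T, K = pred_densities.shape
        w = np.ones(K) / K
        for t in range(T):
            mix = w @ pred_densities[t]        # mixture predictive density at y_t
            grad = pred_densities[t] / mix     # gradient of log(w . p_t) w.r.t. w
            w = w * np.exp(lr * grad)          # multiplicative-weights step
            w = w / w.sum()                    # project back onto the simplex
        return w

For example, with pred_densities = np.array([[0.8, 0.2], [0.7, 0.3]]), online_bma returns weights proportional to the product of each model's densities, whereas the stacking update moves weights in the direction that most improves the mixture's log-score. A fixed learning rate is used here only for brevity; the portfolio-selection literature the abstract points to offers alternatives with explicit regret guarantees.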

Cite

Text

Waxman et al. "Bayesian Ensembling: Insights from Online Optimization and Empirical Bayes." Transactions on Machine Learning Research, 2026.

Markdown

[Waxman et al. "Bayesian Ensembling: Insights from Online Optimization and Empirical Bayes." Transactions on Machine Learning Research, 2026.](https://mlanthology.org/tmlr/2026/waxman2026tmlr-bayesian/)

BibTeX

@article{waxman2026tmlr-bayesian,
  title     = {{Bayesian Ensembling: Insights from Online Optimization and Empirical Bayes}},
  author    = {Waxman, Daniel and Llorente, Fernando and Djuric, Petar},
  journal   = {Transactions on Machine Learning Research},
  year      = {2026},
  url       = {https://mlanthology.org/tmlr/2026/waxman2026tmlr-bayesian/}
}