On Bayesian Bounds

Abstract

We show that several important Bayesian bounds studied in machine learning, in both the batch and the online settings, arise from an application of a simple compression lemma. In particular, we derive (i) PAC-Bayesian bounds in the batch setting, (ii) Bayesian log-loss bounds, and (iii) Bayesian bounded-loss bounds in the online setting using the compression lemma. Although each setting has different semantics for the prior, the posterior, and the loss, we show that the core bound argument is the same. The paper simplifies our understanding of several important and apparently disparate results, and brings to light a powerful tool for developing similar arguments for other methods.
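The compression lemma referred to in the abstract is, in its standard form, the change-of-measure (Donsker–Varadhan) inequality: for any function φ over hypotheses and any prior P and posterior Q, E_Q[φ] − KL(Q‖P) ≤ log E_P[e^φ]. As a hedged illustration (the variable names and the finite hypothesis set below are illustrative, not from the paper), here is a minimal numerical check of that inequality for discrete distributions:

```python
import math

# Sketch: numerically verify the change-of-measure ("compression") inequality
#   E_Q[phi(h)] - KL(Q || P) <= log E_P[exp(phi(h))]
# over a finite hypothesis set, for arbitrary phi, prior P, posterior Q.
# The example values below are illustrative, not taken from the paper.

def kl(q, p):
    """KL divergence KL(q || p) for discrete distributions given as lists."""
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

def compression_gap(phi, q, p):
    """Right-hand side minus left-hand side; nonnegative when the lemma holds."""
    lhs = sum(qi * f for qi, f in zip(q, phi)) - kl(q, p)
    rhs = math.log(sum(pi * math.exp(f) for pi, f in zip(p, phi)))
    return rhs - lhs

# Example: three hypotheses with arbitrary values of phi.
phi = [0.5, -1.2, 2.0]
prior = [0.2, 0.3, 0.5]
posterior = [0.6, 0.3, 0.1]
print(compression_gap(phi, posterior, prior) >= 0)
```

Choosing φ to be (a scaled version of) a loss function and optimizing over Q is what yields the PAC-Bayesian and online bounds in their respective settings; only the semantics of P, Q, and φ change.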

Cite

Text

Banerjee. "On Bayesian Bounds." International Conference on Machine Learning, 2006. doi:10.1145/1143844.1143855

Markdown

[Banerjee. "On Bayesian Bounds." International Conference on Machine Learning, 2006.](https://mlanthology.org/icml/2006/banerjee2006icml-bayesian/) doi:10.1145/1143844.1143855

BibTeX

@inproceedings{banerjee2006icml-bayesian,
  title     = {{On Bayesian Bounds}},
  author    = {Banerjee, Arindam},
  booktitle = {International Conference on Machine Learning},
  year      = {2006},
  pages     = {81--88},
  doi       = {10.1145/1143844.1143855},
  url       = {https://mlanthology.org/icml/2006/banerjee2006icml-bayesian/}
}