PAC-Bayesian Bounds Based on the Rényi Divergence
Abstract
We propose a simplified proof process for PAC-Bayesian generalization bounds that divides the proof into four successive inequalities, easing the "customization" of PAC-Bayesian theorems. We also propose a family of PAC-Bayesian bounds based on the Rényi divergence between the prior and posterior distributions, whereas most PAC-Bayesian bounds are based on the Kullback-Leibler divergence. Finally, we present an empirical evaluation of the tightness of each inequality of the simplified proof, for both the classical PAC-Bayesian bounds and those based on the Rényi divergence.
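For context, the Rényi divergence mentioned in the abstract is the standard order-α divergence; the sketch below states its usual definition and its relation to the Kullback-Leibler divergence, as general background rather than a reproduction of the paper's notation. For a posterior ρ and a prior π over a hypothesis space:

\[
D_{\alpha}(\rho \,\|\, \pi) \;=\; \frac{1}{\alpha - 1} \,\ln \mathop{\mathbb{E}}_{h \sim \pi} \left[ \left( \frac{\rho(h)}{\pi(h)} \right)^{\alpha} \right],
\qquad \alpha > 1,
\]

with \(\lim_{\alpha \to 1^{+}} D_{\alpha}(\rho \,\|\, \pi) = \mathrm{KL}(\rho \,\|\, \pi)\), which is how a Rényi-based family of bounds connects back to the classical KL-based PAC-Bayesian bounds.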
Cite
Text
Bégin et al. "PAC-Bayesian Bounds Based on the Rényi Divergence." International Conference on Artificial Intelligence and Statistics, 2016.

Markdown
[Bégin et al. "PAC-Bayesian Bounds Based on the Rényi Divergence." International Conference on Artificial Intelligence and Statistics, 2016.](https://mlanthology.org/aistats/2016/begin2016aistats-pac/)

BibTeX
@inproceedings{begin2016aistats-pac,
title = {{PAC-Bayesian Bounds Based on the Rényi Divergence}},
author = {Bégin, Luc and Germain, Pascal and Laviolette, François and Roy, Jean-Francis},
booktitle = {International Conference on Artificial Intelligence and Statistics},
year = {2016},
pages = {435--444},
url = {https://mlanthology.org/aistats/2016/begin2016aistats-pac/}
}