PAC-Bayes Bounds for the Risk of the Majority Vote and the Variance of the Gibbs Classifier
Abstract
We propose new PAC-Bayes bounds for the risk of the weighted majority vote that depend on the mean and variance of the error of its associated Gibbs classifier. We show that these bounds can be smaller than the risk of the Gibbs classifier and can be arbitrarily close to zero even if the risk of the Gibbs classifier is close to 1/2. We also show that these bounds can be uniformly estimated on the training data for all possible posteriors Q, and that they can be further improved by using a large sample of unlabelled data.
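The mean-variance dependence described above can be sketched with Cantelli's (one-sided Chebyshev) inequality. Writing $W_Q(x,y)$ for the $Q$-weighted fraction of voters that err on example $(x,y)$, the Gibbs risk is $R(G_Q) = \mathbb{E}[W_Q]$ and the majority vote errs exactly when $W_Q \ge 1/2$. The following is an illustrative sketch of a bound of this form, not the paper's exact statement:

```latex
% Majority-vote risk bounded via Cantelli's inequality (sketch).
% W_Q(x,y): Q-weighted error rate of the voters on a random example (x,y).
R(B_Q) \;=\; \Pr\!\left[\, W_Q \ge \tfrac{1}{2} \,\right]
       \;\le\; \frac{\operatorname{Var}(W_Q)}
                    {\operatorname{Var}(W_Q) + \left(\tfrac{1}{2} - \mathbb{E}[W_Q]\right)^{2}},
\qquad \text{valid whenever } \mathbb{E}[W_Q] = R(G_Q) < \tfrac{1}{2}.
```

Note that the right-hand side can approach zero when $\operatorname{Var}(W_Q)$ is small, even if $R(G_Q)$ is close to $1/2$, which matches the abstract's claim.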
Cite
Text
Lacasse et al. "PAC-Bayes Bounds for the Risk of the Majority Vote and the Variance of the Gibbs Classifier." Neural Information Processing Systems, 2006.
Markdown
[Lacasse et al. "PAC-Bayes Bounds for the Risk of the Majority Vote and the Variance of the Gibbs Classifier." Neural Information Processing Systems, 2006.](https://mlanthology.org/neurips/2006/lacasse2006neurips-pacbayes/)
BibTeX
@inproceedings{lacasse2006neurips-pacbayes,
title = {{PAC-Bayes Bounds for the Risk of the Majority Vote and the Variance of the Gibbs Classifier}},
author = {Lacasse, Alexandre and Laviolette, François and Marchand, Mario and Germain, Pascal and Usunier, Nicolas},
booktitle = {Neural Information Processing Systems},
year = {2006},
pages = {769--776},
url = {https://mlanthology.org/neurips/2006/lacasse2006neurips-pacbayes/}
}