Tighter PAC-Bayes Bounds
Abstract
This paper proposes a PAC-Bayes bound to measure the performance of Support Vector Machine (SVM) classifiers. The bound is based on learning a prior over the distribution of classifiers from a part of the training samples. Experimental work shows that this bound is tighter than the original PAC-Bayes bound, enhancing its predictive capabilities. In addition, using this bound to estimate the hyperparameters of the classifier compares favourably with cross-validation in terms of model accuracy, while requiring far less computation.
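The classical PAC-Bayes theorem that the paper tightens bounds the true risk of a Gibbs classifier by inverting a binary KL divergence: kl(empirical risk ‖ true risk) ≤ (KL(Q‖P) + ln((m+1)/δ))/m, where P is the prior and Q the posterior over classifiers. A minimal sketch of computing such a bound by bisection (function names and parameter values here are illustrative, not from the paper):

```python
import math

def binary_kl(q, p):
    """KL divergence between Bernoulli(q) and Bernoulli(p)."""
    eps = 1e-12
    q = min(max(q, eps), 1 - eps)
    p = min(max(p, eps), 1 - eps)
    return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

def pac_bayes_bound(emp_risk, kl_prior_post, m, delta=0.05):
    """Upper bound on the true risk from the PAC-Bayes-kl theorem:
    kl(emp_risk || true_risk) <= (KL(Q||P) + ln((m+1)/delta)) / m.
    The binary kl is inverted by bisection (it increases in its
    second argument above emp_risk)."""
    rhs = (kl_prior_post + math.log((m + 1) / delta)) / m
    lo, hi = emp_risk, 1.0
    for _ in range(100):  # bisection to invert kl(emp_risk || .)
        mid = (lo + hi) / 2
        if binary_kl(emp_risk, mid) > rhs:
            hi = mid
        else:
            lo = mid
    return hi
```

The bound shrinks as KL(Q‖P) shrinks, which is why learning a data-dependent prior P from a held-out part of the training set, as the paper proposes, can tighten it: a prior closer to the posterior reduces the KL term.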
Cite
Text
Ambroladze et al. "Tighter PAC-Bayes Bounds." Neural Information Processing Systems, 2006.
Markdown
[Ambroladze et al. "Tighter PAC-Bayes Bounds." Neural Information Processing Systems, 2006.](https://mlanthology.org/neurips/2006/ambroladze2006neurips-tighter/)
BibTeX
@inproceedings{ambroladze2006neurips-tighter,
title = {{Tighter PAC-Bayes Bounds}},
author = {Ambroladze, Amiran and Parrado-Hernández, Emilio and Shawe-Taylor, John S.},
booktitle = {Neural Information Processing Systems},
year = {2006},
pages = {9--16},
url = {https://mlanthology.org/neurips/2006/ambroladze2006neurips-tighter/}
}