PAC-Bayes Bounds with Data Dependent Priors

Abstract

This paper presents the prior PAC-Bayes bound and explores its capabilities as a tool to provide tight predictions of SVMs' generalization. The computation of the bound involves estimating a prior distribution over classifiers from the available data, and then incorporating this prior into the usual PAC-Bayes generalization bound. We explore two alternatives: to learn the prior from a separate data set, or to consider an expectation prior that does not need this separate data set. The prior PAC-Bayes bound motivates two SVM-like classification algorithms, prior SVM and η-prior SVM, whose regularization term pushes towards the minimization of the prior PAC-Bayes bound. The experimental work illustrates that the new bounds can be significantly tighter than the original PAC-Bayes bound when applied to SVMs, and among them the combination of the prior PAC-Bayes bound and the prior SVM algorithm gives the tightest bound.
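For context, the standard PAC-Bayes theorem (in the common Langford–Seeger form) that data-dependent priors plug into can be stated as follows; here $P$ is the prior, $Q$ the posterior, $\hat{e}_Q$ and $e_Q$ the empirical and true stochastic error, and $m$ the sample size. This is the textbook statement of the bound, not a reproduction of the paper's refined versions:

```latex
% Standard PAC-Bayes bound (Langford & Seeger form).
% With probability at least 1 - \delta over an i.i.d. sample S of size m,
% simultaneously for all posteriors Q:
\mathrm{KL}\!\left(\hat{e}_Q \,\|\, e_Q\right)
  \le \frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{m+1}{\delta}}{m}
```

The paper's contribution is to tighten the $\mathrm{KL}(Q \,\|\, P)$ term by choosing $P$ from data (a separate set, or an expectation prior) rather than fixing it a priori.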

Cite

Text

Parrado-Hernández et al. "PAC-Bayes Bounds with Data Dependent Priors." Journal of Machine Learning Research, 2012.

Markdown

[Parrado-Hernández et al. "PAC-Bayes Bounds with Data Dependent Priors." Journal of Machine Learning Research, 2012.](https://mlanthology.org/jmlr/2012/parradohernandez2012jmlr-pacbayes/)

BibTeX

@article{parradohernandez2012jmlr-pacbayes,
  title     = {{PAC-Bayes Bounds with Data Dependent Priors}},
  author    = {Parrado-Hernández, Emilio and Ambroladze, Amiran and Shawe-Taylor, John and Sun, Shiliang},
  journal   = {Journal of Machine Learning Research},
  year      = {2012},
  pages     = {3507--3531},
  volume    = {13},
  url       = {https://mlanthology.org/jmlr/2012/parradohernandez2012jmlr-pacbayes/}
}