Gaussian Processes for Classification: Mean-Field Algorithms
Abstract
We derive a mean-field algorithm for binary classification with Gaussian processes that is based on the TAP approach originally proposed in the statistical physics of disordered systems. The theory also yields an approximate leave-one-out estimator for the generalization error, which is computed at no extra computational cost. We show that, from the TAP approach, it is possible to derive both a simpler "naive" mean-field theory and support vector machines (SVMs) as limiting cases. For both mean-field algorithms and support vector machines, simulation results for three small benchmark data sets are presented. They show that one may get state-of-the-art performance by using the leave-one-out estimator for model selection, and that the built-in leave-one-out estimators are extremely precise when compared to the exact leave-one-out estimate. The latter result is taken as strong support for the internal consistency of the mean-field approach.
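The "no extra cost" leave-one-out idea can be illustrated in a simpler, closely related setting. The sketch below is not the paper's mean-field LOO estimator for classification; it shows the analogous closed-form LOO result for GP *regression* on ±1 labels, where the leave-one-out predictive mean is mu_{-i} = y_i - alpha_i / C_ii with C = (K + sigma^2 I)^{-1} and alpha = C y, so all n LOO predictions come from a single matrix inverse. The kernel, data, and parameter choices here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, length_scale=1.0):
    # Squared-exponential kernel matrix between row sets X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def loo_means_closed_form(K, y, noise=0.1):
    # All n leave-one-out predictive means from one inverse:
    # mu_{-i} = y_i - alpha_i / C_ii,  C = (K + noise*I)^{-1}, alpha = C y.
    C = np.linalg.inv(K + noise * np.eye(len(y)))
    alpha = C @ y
    return y - alpha / np.diag(C)

def loo_means_brute_force(X, y, length_scale=1.0, noise=0.1):
    # The expensive route: retrain n times, each time holding out one point.
    n = len(y)
    mus = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        K = rbf_kernel(X[mask], X[mask], length_scale)
        k = rbf_kernel(X[i:i + 1], X[mask], length_scale)[0]
        mus[i] = k @ np.linalg.solve(K + noise * np.eye(n - 1), y[mask])
    return mus

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
y = np.sign(X[:, 0] + 0.3 * rng.normal(size=20))  # synthetic +/-1 labels
K = rbf_kernel(X, X)
fast = loo_means_closed_form(K, y)
slow = loo_means_brute_force(X, y)
print(np.max(np.abs(fast - slow)))  # agreement up to numerical error
```

The classifier's LOO error would then be the fraction of points with sign(mu_{-i}) != y_i; the paper's contribution is an analogous built-in estimator for the non-Gaussian classification likelihood, obtained from the mean-field (TAP) equations themselves.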
Cite
Text
Opper and Winther. "Gaussian Processes for Classification: Mean-Field Algorithms." Neural Computation, 2000. doi:10.1162/089976600300014881
Markdown
[Opper and Winther. "Gaussian Processes for Classification: Mean-Field Algorithms." Neural Computation, 2000.](https://mlanthology.org/neco/2000/opper2000neco-gaussian/) doi:10.1162/089976600300014881
BibTeX
@article{opper2000neco-gaussian,
title = {{Gaussian Processes for Classification: Mean-Field Algorithms}},
author = {Opper, Manfred and Winther, Ole},
journal = {Neural Computation},
year = {2000},
pages = {2655--2684},
doi = {10.1162/089976600300014881},
volume = {12},
url = {https://mlanthology.org/neco/2000/opper2000neco-gaussian/}
}