Leave-One-Out Bounds for Kernel Methods
Abstract
In this article, we study leave-one-out style cross-validation bounds for kernel methods. The essential element in our analysis is a bound on the parameter estimation stability of regularized kernel formulations. Using this result, we derive bounds on the expected leave-one-out cross-validation error, which lead to expected generalization bounds for various kernel algorithms. In addition, we obtain variance bounds for leave-one-out errors. We apply our analysis to some classification and regression problems and compare the resulting bounds with previous results.
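As a concrete illustration of the leave-one-out quantity the paper analyzes (not the paper's own derivation), the sketch below computes leave-one-out residuals for kernel ridge regression two ways: by brute-force retraining with each point held out, and via the well-known closed-form shortcut using the hat matrix H = K(K + λI)⁻¹, for which the identity y_i − f_{−i}(x_i) = (y_i − f(x_i)) / (1 − H_ii) is exact. The RBF kernel, data, and regularization value are illustrative assumptions.

```python
import numpy as np

# Synthetic 1-D regression data (illustrative assumption).
rng = np.random.default_rng(0)
n = 30
X = rng.uniform(-1, 1, size=(n, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(n)

def rbf_kernel(A, B, gamma=2.0):
    # Gaussian (RBF) kernel matrix between rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

lam = 0.1
K = rbf_kernel(X, X)
KI = K + lam * np.eye(n)

# Full-sample kernel ridge fit: f(x_i) = (K alpha)_i.
alpha = np.linalg.solve(KI, y)
resid = y - K @ alpha

# Closed-form leave-one-out residuals via the hat matrix
# H = K (K + lam I)^{-1}, exact for kernel ridge regression.
H = K @ np.linalg.inv(KI)
loo_shortcut = resid / (1.0 - np.diag(H))

# Brute-force leave-one-out for comparison: retrain without point i,
# then evaluate the held-out prediction at x_i.
loo_brute = np.empty(n)
for i in range(n):
    idx = np.delete(np.arange(n), i)
    Ki = K[np.ix_(idx, idx)]
    ai = np.linalg.solve(Ki + lam * np.eye(n - 1), y[idx])
    loo_brute[i] = y[i] - K[i, idx] @ ai

print(np.allclose(loo_shortcut, loo_brute))  # True
```

The agreement of the two computations is what makes leave-one-out analysis tractable for regularized kernel estimators: stability of the fitted parameters under deletion of one sample controls the gap between training and leave-one-out error.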
Cite
Text
Zhang. "Leave-One-Out Bounds for Kernel Methods." Neural Computation, 2003. doi:10.1162/089976603321780326
Markdown
[Zhang. "Leave-One-Out Bounds for Kernel Methods." Neural Computation, 2003.](https://mlanthology.org/neco/2003/zhang2003neco-leaveoneout/) doi:10.1162/089976603321780326
BibTeX
@article{zhang2003neco-leaveoneout,
title = {{Leave-One-Out Bounds for Kernel Methods}},
author = {Zhang, Tong},
journal = {Neural Computation},
year = {2003},
pages = {1397--1437},
doi = {10.1162/089976603321780326},
volume = {15},
url = {https://mlanthology.org/neco/2003/zhang2003neco-leaveoneout/}
}