Stopping Conditions for Exact Computation of Leave-One-Out Error in Support Vector Machines

Abstract

We propose a new stopping condition for a Support Vector Machine (SVM) solver that precisely reflects the objective of the Leave-One-Out (LOO) error computation. The stopping condition guarantees that the output of an intermediate SVM solution is identical to the output of the optimal SVM solution trained with one data point excluded from the training set. A simple augmentation of a general SVM training algorithm allows one to use a stopping criterion equivalent to the proposed sufficient condition. A comprehensive experimental evaluation shows that our method consistently speeds up exact LOO computation, by up to a factor of 13 for the linear kernel. The new algorithm can be seen as an example of constructively guiding an optimization algorithm towards the best attainable expected risk at optimal computational cost.
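
For reference, the following is a minimal Python sketch of the baseline exact LOO computation that the paper accelerates, using scikit-learn's SVC as a stand-in solver: the SVM is retrained n times, each time with one example held out, and the held-out prediction is compared to its label. The paper's stopping condition operates inside the solver, terminating training as soon as the prediction on the held-out point is already determined; that mechanism is not exposed by off-the-shelf solvers, so it is not shown here and this sketch is not the authors' algorithm.

# Baseline exact LOO error for an SVM (retrain-per-point), illustrative only.
import numpy as np
from sklearn.svm import SVC

def exact_loo_error(X, y, C=1.0, kernel="linear"):
    """Retrain an SVM n times, each time leaving one example out,
    and return the fraction of held-out examples that are misclassified."""
    n = len(y)
    mistakes = 0
    for i in range(n):
        mask = np.ones(n, dtype=bool)
        mask[i] = False                      # exclude the i-th point
        clf = SVC(C=C, kernel=kernel)        # optimal solution on the reduced set
        clf.fit(X[mask], y[mask])
        # The paper's stopping condition would let the solver stop early,
        # as soon as this prediction is guaranteed to match the optimum.
        if clf.predict(X[i:i + 1])[0] != y[i]:
            mistakes += 1
    return mistakes / n

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 5))
    y = np.where(X[:, 0] + 0.3 * rng.normal(size=60) > 0, 1, -1)
    print("exact LOO error:", exact_loo_error(X, y))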

Cite

Text

Franc et al. "Stopping Conditions for Exact Computation of Leave-One-Out Error in Support Vector Machines." International Conference on Machine Learning, 2008. doi:10.1145/1390156.1390198

Markdown

[Franc et al. "Stopping Conditions for Exact Computation of Leave-One-Out Error in Support Vector Machines." International Conference on Machine Learning, 2008.](https://mlanthology.org/icml/2008/franc2008icml-stopping/) doi:10.1145/1390156.1390198

BibTeX

@inproceedings{franc2008icml-stopping,
  title     = {{Stopping Conditions for Exact Computation of Leave-One-Out Error in Support Vector Machines}},
  author    = {Franc, Vojtech and Laskov, Pavel and Müller, Klaus-Robert},
  booktitle = {International Conference on Machine Learning},
  year      = {2008},
  pages     = {328--335},
  doi       = {10.1145/1390156.1390198},
  url       = {https://mlanthology.org/icml/2008/franc2008icml-stopping/}
}