A Consolidated Cross-Validation Algorithm for Support Vector Machines via Data Reduction
Abstract
We propose a consolidated cross-validation (CV) algorithm for training and tuning support vector machines (SVMs) on reproducing kernel Hilbert spaces. Our consolidated CV algorithm exploits a recently proposed exact leave-one-out formula for the SVM and accelerates the SVM computation via a data reduction strategy. In addition, to compute the SVM with a bias term (intercept), which existing data reduction methods do not handle, we propose a novel two-stage consolidated CV algorithm. In numerical studies, we demonstrate that our algorithm is about an order of magnitude faster than two mainstream SVM solvers, kernlab and LIBSVM, with almost the same accuracy.
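For context on the cost the paper's exact leave-one-out formula avoids: naive leave-one-out CV refits the model n times for every candidate hyperparameter pair. The sketch below illustrates this naive baseline with an RBF-kernel regularized least-squares classifier in NumPy (a simple stand-in for illustration, not the paper's SVM solver or its data reduction method; all function names here are hypothetical):

```python
import numpy as np

def rbf_kernel(X, Z, gamma):
    # Gaussian (RBF) kernel matrix from pairwise squared distances
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def loo_error(X, y, gamma, lam):
    """Naive leave-one-out error: refits the model n times (one linear
    solve per held-out point), which is the cost an exact LOO formula
    is designed to avoid."""
    n = len(y)
    errs = 0
    for i in range(n):
        idx = np.arange(n) != i
        K = rbf_kernel(X[idx], X[idx], gamma)
        alpha = np.linalg.solve(K + lam * np.eye(n - 1), y[idx])
        k_i = rbf_kernel(X[i:i + 1], X[idx], gamma)
        errs += np.sign(k_i @ alpha)[0] != y[i]
    return errs / n

# tiny synthetic two-class problem
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(1, 0.5, (20, 2))])
y = np.r_[-np.ones(20), np.ones(20)]

# grid-search (gamma, lambda) by naive LOO
grid = [(g, l) for g in (0.5, 1.0) for l in (0.1, 1.0)]
best_gamma, best_lam = min(grid, key=lambda gl: loo_error(X, y, *gl))
```

Each evaluation of `loo_error` costs n linear solves; a closed-form LOO expression, as used by the proposed algorithm for the SVM, replaces this loop with a single fit per hyperparameter setting.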
Cite

Wang and Yang. "A Consolidated Cross-Validation Algorithm for Support Vector Machines via Data Reduction." Neural Information Processing Systems, 2022.

BibTeX
@inproceedings{wang2022neurips-consolidated,
  title     = {{A Consolidated Cross-Validation Algorithm for Support Vector Machines via Data Reduction}},
  author    = {Wang, Boxiang and Yang, Archer},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/wang2022neurips-consolidated/}
}