Improving Efficiency of SVM K-Fold Cross-Validation by Alpha Seeding

Abstract

K-fold cross-validation is commonly used to evaluate the effectiveness of SVMs with selected hyper-parameters. It is known to be expensive, since it requires training k SVMs. However, little work has explored reusing the h-th SVM to improve the efficiency of training the (h+1)-th SVM. In this paper, we propose three algorithms that reuse the h-th SVM to improve the efficiency of training the (h+1)-th SVM. Our key idea is to efficiently identify the support vectors of the next SVM, and to accurately estimate their associated weights (also called alpha values), by using the previous SVM. Our experimental results show that our algorithms are several times faster than k-fold cross-validation that does not make use of the previously trained SVM, while producing the same results (and hence the same accuracy).
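The alpha-seeding idea can be illustrated with a toy sketch (this is not the paper's three algorithms, just the warm-start principle under simplifying assumptions): a minimal projected-gradient solver for the bias-free SVM dual accepts an initial alpha vector, and each fold seeds the alphas of samples shared with the previous fold's training set from the previous solution, starting new samples at zero. All names here (`svm_dual_pg`, the synthetic data, the solver itself) are illustrative assumptions.

```python
import numpy as np

def svm_dual_pg(K, y, C, alpha0=None, tol=1e-6, max_iter=20000):
    """Projected-gradient ascent on the bias-free SVM dual:
    maximize sum(alpha) - 0.5 * alpha' Q alpha,  0 <= alpha_i <= C,
    where Q = diag(y) K diag(y). Returns (alpha, iterations used)."""
    n = len(y)
    alpha = np.zeros(n) if alpha0 is None else np.clip(alpha0, 0.0, C)
    Q = (y[:, None] * y[None, :]) * K
    lr = 1.0 / (np.linalg.eigvalsh(Q)[-1] + 1e-9)   # safe step size 1/L
    for it in range(1, max_iter + 1):
        grad = 1.0 - Q @ alpha                       # gradient of the dual
        new = np.clip(alpha + lr * grad, 0.0, C)     # project onto the box
        if np.max(np.abs(new - alpha)) < tol:
            return new, it
        alpha = new
    return alpha, max_iter

def dual_objective(alpha, K, y):
    Q = (y[:, None] * y[None, :]) * K
    return alpha.sum() - 0.5 * alpha @ Q @ alpha

# Toy k-fold run with alpha seeding between consecutive folds.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
K_full = X @ X.T          # linear kernel
C, k = 1.0, 3
folds = np.array_split(np.arange(len(y)), k)

alpha_prev, idx_prev, iters = None, None, []
for h in range(k):
    tr = np.concatenate([folds[j] for j in range(k) if j != h])
    seed = None
    if alpha_prev is not None:
        # Seed: carry over alphas of samples shared with the previous
        # fold's training set; samples new to this fold start at zero.
        prev = dict(zip(idx_prev.tolist(), alpha_prev))
        seed = np.array([prev.get(g, 0.0) for g in tr.tolist()])
    alpha, it = svm_dual_pg(K_full[np.ix_(tr, tr)], y[tr], C, alpha0=seed)
    iters.append(it)
    alpha_prev, idx_prev = alpha, tr
```

On this toy problem the seeded folds typically converge in fewer iterations than a cold start from zero, since consecutive training sets share (k-2)/(k-1) of their samples; the paper's contribution is doing this identification and estimation efficiently and exactly for real SVM solvers.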

Cite

Text

Wen et al. "Improving Efficiency of SVM K-Fold Cross-Validation by Alpha Seeding." AAAI Conference on Artificial Intelligence, 2017. doi:10.1609/AAAI.V31I1.10785

Markdown

[Wen et al. "Improving Efficiency of SVM K-Fold Cross-Validation by Alpha Seeding." AAAI Conference on Artificial Intelligence, 2017.](https://mlanthology.org/aaai/2017/wen2017aaai-improving/) doi:10.1609/AAAI.V31I1.10785

BibTeX

@inproceedings{wen2017aaai-improving,
  title     = {{Improving Efficiency of SVM K-Fold Cross-Validation by Alpha Seeding}},
  author    = {Wen, Zeyi and Li, Bin and Ramamohanarao, Kotagiri and Chen, Jian and Chen, Yawen and Zhang, Rui},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2017},
  pages     = {2768--2774},
  doi       = {10.1609/AAAI.V31I1.10785},
  url       = {https://mlanthology.org/aaai/2017/wen2017aaai-improving/}
}