Polynomial-Time Decomposition Algorithms for Support Vector Machines
Abstract
This paper studies the convergence properties of a general class of decomposition algorithms for support vector machines (SVMs). We provide a model algorithm for decomposition, and prove necessary and sufficient conditions for stepwise improvement of this algorithm. We introduce a simple “rate certifying” condition and prove a polynomial-time bound on the rate of convergence of the model algorithm when it satisfies this condition. Although it is not clear that existing SVM algorithms satisfy this condition, we provide a version of the model algorithm that does. For this algorithm we show that when the slack multiplier C satisfies 1/2 ≤ C ≤ mL, where m is the number of samples and L is a matrix norm, then it takes no more than 4LC²m⁴/ε iterations to drive the criterion to within ε of its optimum.
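The abstract's iteration bound can be evaluated directly. Below is a minimal sketch, assuming the bound 4LC²m⁴/ε and the constraint 1/2 ≤ C ≤ mL as stated above; the function name and example values are illustrative, not from the paper.

```python
def iteration_bound(L, C, m, eps):
    """Upper bound 4*L*C^2*m^4 / eps on the number of iterations needed
    to drive the SVM criterion to within eps of its optimum, per the
    abstract. Assumes the stated constraint 1/2 <= C <= m*L holds."""
    assert eps > 0, "accuracy eps must be positive"
    assert 0.5 <= C <= m * L, "slack multiplier C outside stated range"
    return 4 * L * C**2 * m**4 / eps

# Example: m = 100 samples, L = 1, C = 1, eps = 0.01
# gives a bound of 4 * 1 * 1 * 100**4 / 0.01 = 4e10 iterations.
```

Note how the m⁴ factor dominates: the bound is polynomial in the sample count, which is the paper's central claim.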
Cite
Text
Hush and Scovel. "Polynomial-Time Decomposition Algorithms for Support Vector Machines." Machine Learning, 2003. doi:10.1023/A:1021877911972
Markdown
[Hush and Scovel. "Polynomial-Time Decomposition Algorithms for Support Vector Machines." Machine Learning, 2003.](https://mlanthology.org/mlj/2003/hush2003mlj-polynomialtime/) doi:10.1023/A:1021877911972
BibTeX
@article{hush2003mlj-polynomialtime,
title = {{Polynomial-Time Decomposition Algorithms for Support Vector Machines}},
author = {Hush, Don R. and Scovel, Clint},
journal = {Machine Learning},
year = {2003},
pages = {51--71},
doi = {10.1023/A:1021877911972},
volume = {51},
url = {https://mlanthology.org/mlj/2003/hush2003mlj-polynomialtime/}
}