SMO Algorithm for Least-Squares SVM Formulation

Abstract

This article extends the well-known SMO algorithm for support vector machines (SVMs) to least-squares SVM formulations that include LS-SVM classification, kernel ridge regression, and a particular form of regularized kernel Fisher discriminant. The algorithm is shown to be asymptotically convergent. It is also extremely easy to implement. Computational experiments show that the algorithm is fast and scales efficiently (quadratically) as a function of the number of examples.
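As a rough illustration of the dual problem the abstract refers to, the sketch below applies SMO-style pairwise updates to the LS-SVM dual: minimize ½αᵀHα − 1ᵀα with H = (yyᵀ)∘K + I/γ, subject to yᵀα = 0. This is a toy reconstruction under stated assumptions, not the authors' exact algorithm (their working-set selection, stopping criteria, and caching differ); the `rbf_kernel` and `lssvm_smo` names, the kernel choice, and all parameter values are illustrative.

```python
import numpy as np

def rbf_kernel(X, Z, gamma_k=0.5):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma_k * d2)

def lssvm_smo(K, y, C=10.0, tol=1e-6, max_iter=10000):
    """Toy SMO-style solver for the LS-SVM dual (illustrative, not the paper's).

    Minimizes 0.5 * a^T H a - 1^T a  subject to  y^T a = 0,
    where H = (y y^T) * K + I / C.  Each step takes an exact
    line-search move on the maximally KKT-violating pair (i, j);
    unlike classical SVM SMO, no box clipping is needed because
    LS-SVM multipliers are unbounded.
    """
    n = len(y)
    a = np.zeros(n)
    g = -np.ones(n)                    # dual gradient H a - 1 at a = 0
    H_diag = np.diag(K) + 1.0 / C
    for _ in range(max_iter):
        yg = y * g
        i, j = int(np.argmax(yg)), int(np.argmin(yg))
        if yg[i] - yg[j] < tol:        # optimality: all y_k g_k equal
            break
        eta = H_diag[i] + H_diag[j] - 2.0 * K[i, j]   # curvature > 0
        delta = (yg[i] - yg[j]) / eta                 # exact step size
        a[i] -= y[i] * delta           # pair update preserves y^T a = 0
        a[j] += y[j] * delta
        g -= delta * y * (K[:, i] - K[:, j])          # incremental gradient
        g[i] -= delta * y[i] / C
        g[j] += delta * y[j] / C
    b = -np.mean(y * g)                # bias from the stationarity condition
    return a, b
```

A usage sketch: with `K = rbf_kernel(X, X)`, train via `a, b = lssvm_smo(K, y)` and predict a new point `x` with `sign(rbf_kernel(x[None], X) @ (a * y) + b)`. The incremental gradient update is what makes each pair step O(n), mirroring the cheap per-iteration cost that lets SMO-type methods scale.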

Cite

Text

Keerthi and Shevade. "SMO Algorithm for Least-Squares SVM Formulation." Neural Computation, 2003. doi:10.1162/089976603762553013

Markdown

[Keerthi and Shevade. "SMO Algorithm for Least-Squares SVM Formulation." Neural Computation, 2003.](https://mlanthology.org/neco/2003/keerthi2003neco-smo/) doi:10.1162/089976603762553013

BibTeX

@article{keerthi2003neco-smo,
  title     = {{SMO Algorithm for Least-Squares SVM Formulation}},
  author    = {Keerthi, S. Sathiya and Shevade, Shirish K.},
  journal   = {Neural Computation},
  year      = {2003},
  pages     = {487--507},
  doi       = {10.1162/089976603762553013},
  volume    = {15},
  url       = {https://mlanthology.org/neco/2003/keerthi2003neco-smo/}
}