SVMTorch: Support Vector Machines for Large-Scale Regression Problems (Kernel Machines Section)

Abstract

Support Vector Machines (SVMs) for regression problems are trained by solving a quadratic optimization problem that requires on the order of l² memory and time, where l is the number of training examples. In this paper, we propose a decomposition algorithm, SVMTorch (available at http://www.idiap.ch/learning/SVMTorch.html), which is similar to SVM-Light, proposed by Joachims (1999) for classification problems, but adapted to regression problems. With this algorithm, one can now efficiently solve large-scale regression problems (more than 20000 examples). Comparisons with Nodelib, another publicly available SVM algorithm for large-scale regression problems from Flake and Lawrence (2000), yielded significant time improvements. Finally, based on a recent paper by Lin (2000), we show that a convergence proof exists for our algorithm.
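The quadratic resource requirement mentioned above can be illustrated with a back-of-the-envelope sketch. The function names below are illustrative, not part of SVMTorch; the working-set size q is a hypothetical parameter standing in for the small subproblem a decomposition method optimizes at each step.

```python
# Memory needed to store the full l x l kernel matrix in double precision.
# This quadratic growth is what makes large-scale SVM regression training
# impractical without a decomposition strategy.
def kernel_matrix_bytes(l, bytes_per_entry=8):
    return l * l * bytes_per_entry

# A decomposition method instead optimizes over a small working set of
# q variables at a time, so only q rows of the kernel matrix (q x l
# entries) need to be computed or cached per iteration.
def working_set_bytes(l, q, bytes_per_entry=8):
    return q * l * bytes_per_entry

# At the scale the abstract mentions (more than 20000 examples), the
# full kernel matrix alone needs about 3.2 GB, while a working set of
# q = 10 rows needs only about 1.6 MB per iteration.
full = kernel_matrix_bytes(20000)        # 3_200_000_000 bytes
subset = working_set_bytes(20000, q=10)  # 1_600_000 bytes
```

This gap between l² storage and q·l storage per iteration is what allows decomposition algorithms such as SVMTorch to scale to problems where the full quadratic program cannot be held in memory.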

Cite

Text

Collobert and Bengio. "SVMTorch: Support Vector Machines for Large-Scale Regression Problems (Kernel Machines Section)." Journal of Machine Learning Research, 2001.

Markdown

[Collobert and Bengio. "SVMTorch: Support Vector Machines for Large-Scale Regression Problems (Kernel Machines Section)." Journal of Machine Learning Research, 2001.](https://mlanthology.org/jmlr/2001/collobert2001jmlr-svmtorch/)

BibTeX

@article{collobert2001jmlr-svmtorch,
  title     = {{SVMTorch: Support Vector Machines for Large-Scale Regression Problems (Kernel Machines Section)}},
  author    = {Collobert, Ronan and Bengio, Samy},
  journal   = {Journal of Machine Learning Research},
  year      = {2001},
  pages     = {143--160},
  volume    = {1},
  url       = {https://mlanthology.org/jmlr/2001/collobert2001jmlr-svmtorch/}
}