Two-Dimensional Solution Path for Support Vector Regression
Abstract
Recently, a very appealing approach was proposed to compute the entire solution path for support vector classification (SVC) with very low extra computational cost. This approach was later extended to a support vector regression (SVR) model called ε-SVR. However, the method requires that the error parameter ε be set a priori, which is only possible if the desired accuracy of the approximation can be specified in advance. In this paper, we show that the solution path for ε-SVR is also piecewise linear with respect to ε. We further propose an efficient algorithm for exploring the two-dimensional solution space defined by the regularization and error parameters. As opposed to the algorithm for SVC, our proposed algorithm for ε-SVR initializes the number of support vectors to zero and then increases it gradually as the algorithm proceeds. As such, a good regression function possessing the sparseness property can be obtained after only a few iterations.
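The abstract's sparseness claim can be illustrated with an ordinary ε-SVR solver (this is not the paper's path-following algorithm, which traces solutions across all ε values; scikit-learn's `SVR` solves one `(C, epsilon)` setting at a time). A wider ε-tube ignores more training points, so fewer of them end up as support vectors:

```python
# Hedged illustration: sweep epsilon at fixed C and count support vectors.
# Larger epsilon tubes absorb more residuals, yielding sparser models,
# which is the property the paper's algorithm exploits from the start.
import numpy as np
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0, 5, (80, 1)), axis=0)
y = np.sin(X).ravel() + 0.1 * rng.randn(80)

counts = []
for eps in [0.01, 0.1, 0.3]:
    model = SVR(kernel="rbf", C=1.0, epsilon=eps).fit(X, y)
    counts.append(len(model.support_))

print(counts)  # support-vector count shrinks as epsilon grows
```

The paper's contribution is to compute this entire family of solutions efficiently, exploiting the fact that the solution is piecewise linear in ε (and, as previously shown, in the regularization parameter).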
Cite
Text
Wang et al. "Two-Dimensional Solution Path for Support Vector Regression." International Conference on Machine Learning, 2006. doi:10.1145/1143844.1143969
Markdown
[Wang et al. "Two-Dimensional Solution Path for Support Vector Regression." International Conference on Machine Learning, 2006.](https://mlanthology.org/icml/2006/wang2006icml-two/) doi:10.1145/1143844.1143969
BibTeX
@inproceedings{wang2006icml-two,
title = {{Two-Dimensional Solution Path for Support Vector Regression}},
author = {Wang, Gang and Yeung, Dit-Yan and Lochovsky, Frederick H.},
booktitle = {International Conference on Machine Learning},
year = {2006},
pages = {993--1000},
doi = {10.1145/1143844.1143969},
url = {https://mlanthology.org/icml/2006/wang2006icml-two/}
}