Core Vector Regression for Very Large Regression Problems
Abstract
In this paper, we extend the recently proposed Core Vector Machine (CVM) algorithm to the regression setting by generalizing the underlying minimum enclosing ball problem. The resulting Core Vector Regression (CVR) algorithm can be used with any linear or nonlinear kernel and obtains provably approximately optimal solutions. Its asymptotic time complexity is linear in the number of training patterns m, while its space complexity is independent of m. Experiments show that CVR performs comparably to SVR, but is much faster and produces far fewer support vectors on very large data sets. It has also been successfully applied to large 3D point sets in computer graphics for modeling implicit surfaces.
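The core-set idea underlying CVM/CVR is a (1+ε)-approximation to the minimum enclosing ball (MEB): repeatedly pull the ball's center toward the farthest remaining point. As a rough illustration only (the paper's actual algorithm operates in the kernel-induced feature space and maintains an explicit core-set; the function name `approx_meb` and the plain input-space setting here are assumptions for the sketch), a Bădoiu–Clarkson-style iteration looks like this:

```python
import numpy as np

def approx_meb(points, eps=0.1):
    """Sketch of a (1+eps)-approximate minimum enclosing ball.

    Runs on the order of 1/eps^2 iterations; each step moves the
    center a shrinking fraction toward the current farthest point.
    `points` is an (m, d) array of input-space vectors.
    """
    c = points[0].astype(float).copy()          # start at an arbitrary point
    for i in range(1, int(np.ceil(1.0 / eps**2)) + 1):
        # farthest point from the current center
        p = points[np.argmax(np.linalg.norm(points - c, axis=1))]
        c += (p - c) / (i + 1)                  # shrinking step toward it
    r = np.linalg.norm(points - c, axis=1).max()  # radius covering all points
    return c, r
```

Because each iteration only needs the farthest point, the number of center updates depends on ε rather than on m, which is the intuition behind CVR's time complexity being linear in m and its space complexity being independent of m.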
Cite
Text
Tsang et al. "Core Vector Regression for Very Large Regression Problems." International Conference on Machine Learning, 2005. doi:10.1145/1102351.1102466

Markdown

[Tsang et al. "Core Vector Regression for Very Large Regression Problems." International Conference on Machine Learning, 2005.](https://mlanthology.org/icml/2005/tsang2005icml-core/) doi:10.1145/1102351.1102466

BibTeX
@inproceedings{tsang2005icml-core,
title = {{Core Vector Regression for Very Large Regression Problems}},
author = {Tsang, Ivor W. and Kwok, James T. and Lai, Kimo T.},
booktitle = {International Conference on Machine Learning},
year = {2005},
pages = {912-919},
doi = {10.1145/1102351.1102466},
url = {https://mlanthology.org/icml/2005/tsang2005icml-core/}
}