Sparse Online Greedy Support Vector Regression
Abstract
We present a novel algorithm for sparse online greedy kernel-based nonlinear regression. This algorithm improves current approaches to kernel-based regression in two aspects. First, it operates online: at each time step it observes a single new input sample, performs an update and discards it. Second, the solution maintained is extremely sparse. This is achieved by an explicit greedy sparsification process that admits into the kernel representation a new input sample only if its feature space image is linearly independent of the images of previously admitted samples. We show that the algorithm implements a form of gradient ascent and demonstrate its scaling and noise tolerance properties on three benchmark regression problems.
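The greedy sparsification step described in the abstract can be sketched as a kernel linear-dependence test: a new sample is admitted to the dictionary only if its feature-space image cannot be (approximately) reconstructed from the images of previously admitted samples. The sketch below is an illustration of that idea, not the paper's exact procedure; the Gaussian kernel, the `threshold` parameter, and the incremental inverse-update are assumptions for the example.

```python
import numpy as np

def gaussian_kernel(x, y, width=1.0):
    # Hypothetical kernel choice for illustration.
    return np.exp(-np.sum((x - y) ** 2) / (2 * width ** 2))

def sparsify(samples, kernel=gaussian_kernel, threshold=1e-3):
    """Greedily build a dictionary: admit a sample only if its feature-space
    image is (approximately) linearly independent of the admitted images."""
    dictionary = []
    K_inv = None  # inverse of the kernel (Gram) matrix of the dictionary
    for x in samples:
        if not dictionary:
            dictionary.append(x)
            K_inv = np.array([[1.0 / kernel(x, x)]])
            continue
        k_vec = np.array([kernel(d, x) for d in dictionary])
        a = K_inv @ k_vec                 # best reconstruction coefficients
        delta = kernel(x, x) - k_vec @ a  # squared residual in feature space
        if delta > threshold:             # approximately independent: admit
            dictionary.append(x)
            # Grow K_inv via the block-matrix inversion identity.
            K_inv = np.block([
                [delta * K_inv + np.outer(a, a), -a[:, None]],
                [-a[None, :], np.ones((1, 1))],
            ]) / delta
    return dictionary
```

For instance, feeding the same two points twice yields a dictionary of size two: the repeated points have zero residual in feature space and are discarded, which is what keeps the maintained solution sparse.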
Cite
Text
Engel et al. "Sparse Online Greedy Support Vector Regression." European Conference on Machine Learning, 2002. doi:10.1007/3-540-36755-1_8
Markdown
[Engel et al. "Sparse Online Greedy Support Vector Regression." European Conference on Machine Learning, 2002.](https://mlanthology.org/ecmlpkdd/2002/engel2002ecml-sparse/) doi:10.1007/3-540-36755-1_8
BibTeX
@inproceedings{engel2002ecml-sparse,
title = {{Sparse Online Greedy Support Vector Regression}},
author = {Engel, Yaakov and Mannor, Shie and Meir, Ron},
booktitle = {European Conference on Machine Learning},
year = {2002},
pages = {84--96},
doi = {10.1007/3-540-36755-1_8},
url = {https://mlanthology.org/ecmlpkdd/2002/engel2002ecml-sparse/}
}