Implicit Online Learning with Kernels
Abstract
We present two new algorithms for online learning in reproducing kernel Hilbert spaces. Our first algorithm, ILK (implicit online learning with kernels), employs a new, implicit update technique that can be applied to a wide variety of convex loss functions. We then introduce a bounded memory version, SILK (sparse ILK), that maintains a compact representation of the predictor without compromising solution quality, even in non-stationary environments. We prove loss bounds and analyze the convergence rate of both. Experimental evidence shows that our proposed algorithms outperform current methods on synthetic and real data.
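As context for the "implicit update" the abstract refers to, below is a minimal sketch of the proximal-point style update that implicit online kernel methods build on, specialized to squared loss with an RBF kernel. The class name ImplicitKernelRegressor, the loss choice, and all parameter values are illustrative assumptions, not the paper's exact ILK or SILK procedures. The key point is that the loss gradient is evaluated at the new iterate rather than the current one, which yields a closed-form, automatically damped step for squared loss.

```python
import numpy as np

def rbf_kernel(x1, x2, gamma=1.0):
    """Gaussian RBF kernel k(x1, x2) = exp(-gamma * ||x1 - x2||^2)."""
    d = x1 - x2
    return np.exp(-gamma * np.dot(d, d))

class ImplicitKernelRegressor:
    """Online kernel regression with an implicit (proximal) update.

    Each round solves
        f_{t+1} = argmin_f  0.5 * ||f - f_t||_H^2 + (eta/2) * (f(x_t) - y_t)^2,
    which for squared loss has the closed-form coefficient
        alpha_t = eta * (y_t - f_t(x_t)) / (1 + eta * k(x_t, x_t)).
    """

    def __init__(self, eta=0.5, gamma=1.0):
        self.eta = eta
        self.gamma = gamma
        self.centers = []   # stored inputs x_i (support set)
        self.alphas = []    # expansion coefficients alpha_i

    def predict(self, x):
        return sum(a * rbf_kernel(c, x, self.gamma)
                   for c, a in zip(self.centers, self.alphas))

    def update(self, x, y):
        # Implicit step: because the gradient is taken at the new iterate,
        # the effective step size is damped by 1 / (1 + eta * k(x, x)).
        k_xx = rbf_kernel(x, x, self.gamma)
        alpha = self.eta * (y - self.predict(x)) / (1.0 + self.eta * k_xx)
        self.centers.append(x)
        self.alphas.append(alpha)

# Usage: learn y = sin(x) online from a stream of noisy samples.
rng = np.random.default_rng(0)
model = ImplicitKernelRegressor(eta=1.0, gamma=2.0)
for _ in range(200):
    x = rng.uniform(-3, 3, size=1)
    y = np.sin(x[0]) + 0.1 * rng.normal()
    model.update(x, y)
print(f"f(1.0) ~= {model.predict(np.array([1.0])):.3f}  (target {np.sin(1.0):.3f})")
```

Note that this sketch stores one kernel center per round, so memory grows without bound; this is precisely the issue the abstract's bounded-memory variant, SILK, addresses by keeping the expansion sparse.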
Cite
Text
Cheng et al. "Implicit Online Learning with Kernels." Neural Information Processing Systems, 2006.
Markdown
[Cheng et al. "Implicit Online Learning with Kernels." Neural Information Processing Systems, 2006.](https://mlanthology.org/neurips/2006/cheng2006neurips-implicit/)
BibTeX
@inproceedings{cheng2006neurips-implicit,
title = {{Implicit Online Learning with Kernels}},
author = {Cheng, Li and Schuurmans, Dale and Wang, Shaojun and Caelli, Terry and Vishwanathan, S. V. N.},
booktitle = {Neural Information Processing Systems},
year = {2006},
pages = {249--256},
url = {https://mlanthology.org/neurips/2006/cheng2006neurips-implicit/}
}