Learning Additive Models Online with Fast Evaluating Kernels

Abstract

We develop three new techniques to build on the recent advances in online learning with kernels. First, we show that an exponential speed-up in prediction time per trial is possible for such algorithms as the Kernel-Adatron, the Kernel-Perceptron, and ROMMA for specific additive models. Second, we show that the techniques of the recent algorithms developed for online linear prediction when the best predictor changes over time may be implemented for kernel-based learners at no additional asymptotic cost. Finally, we introduce a new online kernel-based learning algorithm for which we give worst-case loss bounds for the ε-insensitive square loss.
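The abstract's first contribution concerns prediction time for kernel-based online learners with additive models. As context, a minimal sketch of a standard mistake-driven Kernel-Perceptron with an additive kernel k(x, z) = Σᵢ kᵢ(xᵢ, zᵢ) is given below; the per-coordinate `min` kernel is a hypothetical stand-in for the spline kernels the paper considers, and the fast-evaluation data structures that yield the paper's exponential speed-up are not reproduced here — naive prediction as written costs O(m·d) per trial for m stored mistakes in d dimensions.

```python
def additive_kernel(x, z):
    # A simple additive kernel: sum of per-coordinate min kernels
    # (a stand-in for the spline kernels discussed in the paper).
    return sum(min(a, b) for a, b in zip(x, z))


class KernelPerceptron:
    """Mistake-driven online kernel Perceptron (sketch)."""

    def __init__(self, kernel):
        self.kernel = kernel
        self.support = []  # examples stored on mistakes
        self.labels = []   # their labels (+1 / -1)

    def predict(self, x):
        # Score is a kernel expansion over stored mistakes.
        s = sum(y * self.kernel(v, x)
                for v, y in zip(self.support, self.labels))
        return 1 if s >= 0 else -1

    def trial(self, x, y):
        # One online trial: predict, then update only on a mistake.
        if self.predict(x) != y:
            self.support.append(x)
            self.labels.append(y)


# Usage: a few online trials on toy data.
p = KernelPerceptron(additive_kernel)
p.trial((1.0, 1.0), -1)   # mistake: stored
p.trial((5.0, 5.0), +1)   # mistake: stored
```

The paper's point is that for specific additive models this kernel expansion can be evaluated exponentially faster than the naive sum above.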

Cite

Text

Herbster. "Learning Additive Models Online with Fast Evaluating Kernels." Annual Conference on Computational Learning Theory, 2001. doi:10.1007/3-540-44581-1_29

Markdown

[Herbster. "Learning Additive Models Online with Fast Evaluating Kernels." Annual Conference on Computational Learning Theory, 2001.](https://mlanthology.org/colt/2001/herbster2001colt-learning/) doi:10.1007/3-540-44581-1_29

BibTeX

@inproceedings{herbster2001colt-learning,
  title     = {{Learning Additive Models Online with Fast Evaluating Kernels}},
  author    = {Herbster, Mark},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2001},
  pages     = {444--460},
  doi       = {10.1007/3-540-44581-1_29},
  url       = {https://mlanthology.org/colt/2001/herbster2001colt-learning/}
}