The Forgetron: A Kernel-Based Perceptron on a Fixed Budget

Abstract

The Perceptron algorithm, despite its simplicity, often performs well on online classification tasks. The Perceptron becomes especially effective when it is used in conjunction with kernels. However, a common difficulty encountered when implementing kernel-based online algorithms is the amount of memory required to store the online hypothesis, which may grow without bound. In this paper we present and analyze the Forgetron algorithm for kernel-based online learning on a fixed memory budget. To our knowledge, this is the first online learning algorithm which, on one hand, maintains a strict limit on the number of examples it stores while, on the other hand, entertains a relative mistake bound. In addition to the formal results, we also present experiments with real datasets which underscore the merits of our approach.
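The idea described above can be illustrated with a minimal sketch of a budgeted kernel perceptron: on each mistake the example is stored, and when the budget is exceeded the oldest stored example is discarded. This is a simplification for illustration only — the actual Forgetron also shrinks the hypothesis before removal (which is what makes its mistake bound possible), and the Gaussian kernel here is an arbitrary choice, not one mandated by the paper.

```python
import numpy as np

def gaussian_kernel(x, z, gamma=1.0):
    # RBF kernel, chosen here only for illustration
    return np.exp(-gamma * np.sum((x - z) ** 2))

class BudgetKernelPerceptron:
    """Simplified Forgetron-style learner: update on mistakes,
    and forget the oldest support example once the budget is full.
    (The real Forgetron also scales the hypothesis down before removal.)"""

    def __init__(self, budget=10, gamma=1.0):
        self.budget = budget
        self.gamma = gamma
        self.support = []  # list of (x, y) pairs defining the hypothesis

    def predict(self, x):
        # Sign of the kernel expansion over the stored examples
        s = sum(y * gaussian_kernel(xi, x, self.gamma)
                for xi, y in self.support)
        return 1 if s >= 0 else -1

    def update(self, x, y):
        # Perceptron rule: modify the hypothesis only on a mistake
        if self.predict(x) != y:
            self.support.append((x, y))
            if len(self.support) > self.budget:
                self.support.pop(0)  # strict budget: drop the oldest example
            return True
        return False
```

A short usage example: feeding a stream of labeled points keeps the number of stored examples at or below the budget regardless of stream length.

```python
rng = np.random.default_rng(0)
clf = BudgetKernelPerceptron(budget=5, gamma=0.5)
for _ in range(200):
    x = rng.normal(size=2)
    clf.update(x, 1 if x[0] > 0 else -1)
print(len(clf.support))  # never exceeds 5
```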

Cite

Text

Dekel et al. "The Forgetron: A Kernel-Based Perceptron on a Fixed Budget." Neural Information Processing Systems, 2005.

Markdown

[Dekel et al. "The Forgetron: A Kernel-Based Perceptron on a Fixed Budget." Neural Information Processing Systems, 2005.](https://mlanthology.org/neurips/2005/dekel2005neurips-forgetron/)

BibTeX

@inproceedings{dekel2005neurips-forgetron,
  title     = {{The Forgetron: A Kernel-Based Perceptron on a Fixed Budget}},
  author    = {Dekel, Ofer and Shalev-Shwartz, Shai and Singer, Yoram},
  booktitle = {Neural Information Processing Systems},
  year      = {2005},
  pages     = {259-266},
  url       = {https://mlanthology.org/neurips/2005/dekel2005neurips-forgetron/}
}