Incremental and Decremental Support Vector Machine Learning

Abstract

An on-line recursive algorithm for training support vector machines, one vector at a time, is presented. Adiabatic increments retain the Kuhn-Tucker conditions on all previously seen training data, in a number of steps each computed analytically. The incremental procedure is reversible, and decremental "unlearning" offers an efficient method to exactly evaluate leave-one-out generalization performance. Interpretation of decremental unlearning in feature space sheds light on the relationship between generalization and geometry of the data.
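The bookkeeping the abstract describes revolves around the Kuhn-Tucker conditions, which partition the training set into margin support vectors (0 < alpha_i < C, with y_i f(x_i) = 1), error support vectors (alpha_i = C), and interior points (alpha_i = 0). The sketch below is a minimal illustration of that partition check for an already-trained SVM, not the authors' analytic update itself; the kernel matrix K, labels y, coefficients alpha, and bias b are assumed inputs, and all names are hypothetical.

import numpy as np

def kkt_partition(K, y, alpha, b, C, tol=1e-6):
    """Check the Kuhn-Tucker conditions on every training point.

    With margin function g_i = y_i * f(x_i) - 1, the conditions are:
      g_i >= 0  when alpha_i == 0      (interior point, outside the margin)
      g_i == 0  when 0 < alpha_i < C   (margin support vector)
      g_i <= 0  when alpha_i == C      (error support vector)
    """
    # Decision function on the training set: f(x_i) = sum_j alpha_j y_j K_ij + b
    f = K @ (alpha * y) + b
    g = y * f - 1.0

    margin = (alpha > tol) & (alpha < C - tol)
    error = alpha >= C - tol
    interior = alpha <= tol

    ok = (np.abs(g[margin]) < tol).all() \
        and (g[error] <= tol).all() \
        and (g[interior] >= -tol).all()
    return ok, g

The incremental and decremental steps of the paper adjust alpha and b analytically so that a check like this continues to hold as each example is added or removed, which is what makes exact leave-one-out evaluation tractable.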

Cite

Text

Cauwenberghs and Poggio. "Incremental and Decremental Support Vector Machine Learning." Neural Information Processing Systems, 2000.

Markdown

[Cauwenberghs and Poggio. "Incremental and Decremental Support Vector Machine Learning." Neural Information Processing Systems, 2000.](https://mlanthology.org/neurips/2000/cauwenberghs2000neurips-incremental/)

BibTeX

@inproceedings{cauwenberghs2000neurips-incremental,
  title     = {{Incremental and Decremental Support Vector Machine Learning}},
  author    = {Cauwenberghs, Gert and Poggio, Tomaso},
  booktitle = {Neural Information Processing Systems},
  year      = {2000},
  pages     = {409--415},
  url       = {https://mlanthology.org/neurips/2000/cauwenberghs2000neurips-incremental/}
}