Multiplicative Updates for Nonnegative Quadratic Programming in Support Vector Machines

Abstract

We derive multiplicative updates for solving the nonnegative quadratic programming problem in support vector machines (SVMs). The updates have a simple closed form, and we prove that they converge monotonically to the solution of the maximum margin hyperplane. The updates optimize the traditionally proposed objective function for SVMs. They do not involve any heuristics such as choosing a learning rate or deciding which variables to update at each iteration. They can be used to adjust all the quadratic programming variables in parallel with a guarantee of improvement at each iteration. We analyze the asymptotic convergence of the updates and show that the coefficients of non-support vectors decay geometrically to zero at a rate that depends on their margins. In practice, the updates converge very rapidly to good classifiers.
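The abstract does not reproduce the update rule itself, but the nonnegative quadratic program it refers to has the standard form: minimize F(v) = ½ vᵀAv + bᵀv subject to v ≥ 0, where for the SVM dual A is built from the labels and kernel and b = −1. Below is a minimal sketch, under the assumption of that standard form, of the kind of closed-form multiplicative update the paper describes, using the entrywise splitting A = A⁺ − A⁻ into nonnegative parts; the function name, the small regularizer, and the toy data are illustrative assumptions, not part of the paper.

```python
import numpy as np

def multiplicative_nqp(A, b, num_iters=500, eps=1e-12):
    """Sketch of a multiplicative update for the nonnegative quadratic
    program  min_v 0.5 * v.T @ A @ v + b.T @ v  subject to v >= 0.
    Each component of v is rescaled by a closed-form nonnegative factor,
    so no learning rate or active-set heuristic is needed and all
    variables can be updated in parallel."""
    Ap = np.maximum(A, 0.0)   # A+ : nonnegative part of A
    Am = np.maximum(-A, 0.0)  # A- : nonnegative part of -A
    v = np.ones(len(b))       # strictly positive initialization
    for _ in range(num_iters):
        a = Ap @ v
        c = Am @ v
        # Multiplicative factor (-b_i + sqrt(b_i^2 + 4 a_i c_i)) / (2 a_i);
        # it equals 1 exactly at fixed points of the KKT conditions.
        v = v * (-b + np.sqrt(b * b + 4.0 * a * c)) / (2.0 * a + eps)
    return v

# Toy SVM-style dual (bias/equality constraint omitted for simplicity):
# minimize 0.5 * alpha.T @ Q @ alpha - 1.T @ alpha over alpha >= 0,
# with Q_ij = y_i y_j <x_i, x_j> for a linear kernel.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
y = np.where(X[:, 0] + 0.1 * rng.normal(size=20) > 0, 1.0, -1.0)
Q = (y[:, None] * y[None, :]) * (X @ X.T)
alpha = multiplicative_nqp(Q + 1e-8 * np.eye(20), -np.ones(20))
```

Consistent with the asymptotic analysis summarized above, running this sketch shows the coefficients of well-classified non-support points shrinking geometrically toward zero, while the coefficients of points near the margin remain bounded away from zero.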

Cite

Text

Sha et al. "Multiplicative Updates for Nonnegative Quadratic Programming in Support Vector Machines." Neural Information Processing Systems, 2002.

Markdown

[Sha et al. "Multiplicative Updates for Nonnegative Quadratic Programming in Support Vector Machines." Neural Information Processing Systems, 2002.](https://mlanthology.org/neurips/2002/sha2002neurips-multiplicative/)

BibTeX

@inproceedings{sha2002neurips-multiplicative,
  title     = {{Multiplicative Updates for Nonnegative Quadratic Programming in Support Vector Machines}},
  author    = {Sha, Fei and Saul, Lawrence K. and Lee, Daniel D.},
  booktitle = {Neural Information Processing Systems},
  year      = {2002},
  pages     = {1065-1072},
  url       = {https://mlanthology.org/neurips/2002/sha2002neurips-multiplicative/}
}