An Efficient Method for Simplifying Support Vector Machines

Abstract

In this paper we describe a new method for reducing the complexity of support vector machines by reducing the number of support vectors included in their solutions. The reduction process iteratively selects the two nearest support vectors belonging to the same class and replaces them with a newly constructed vector. Through an analysis of the relation between vectors in the input and feature spaces, we show that constructing the new vector requires only finding the unique maximum of a one-variable function on the interval (0, 1), rather than minimizing a many-variable function with local minima as in previous reduced-set methods. Experimental results on real-life datasets show that the proposed method is effective in reducing the number of support vectors while preserving the machine's generalization performance.
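The merge step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a Gaussian (RBF) kernel K(x, y) = exp(-gamma·||x − y||²), places the merged vector z = u·m + (1 − u)·n on the segment between the two selected support vectors, and locates the maximum of the resulting one-variable function on (0, 1) by golden-section search (the search method and all names here, such as `merge_pair`, are illustrative choices, not taken from the paper).

```python
import numpy as np

def merge_pair(m, n, alpha_m, alpha_n, gamma):
    """Replace two same-class support vectors m, n (coefficients
    alpha_m, alpha_n > 0) by one vector z = u*m + (1-u)*n.

    For the RBF kernel, the quality of z as a substitute for the pair
    reduces to the one-variable function
        f(u) = alpha_m * K(m, z) + alpha_n * K(n, z),  u in (0, 1),
    which (per the paper's claim) has a unique maximum on that interval.
    """
    d2 = float(np.sum((m - n) ** 2))  # squared distance ||m - n||^2

    def f(u):
        # For z on the segment: ||m-z||^2 = (1-u)^2 d2, ||n-z||^2 = u^2 d2
        return (alpha_m * np.exp(-gamma * (1.0 - u) ** 2 * d2)
                + alpha_n * np.exp(-gamma * u ** 2 * d2))

    # Golden-section search for the unique maximum of f on (0, 1).
    invphi = (np.sqrt(5.0) - 1.0) / 2.0
    a, b = 0.0, 1.0
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    for _ in range(100):
        if f(c) > f(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    u = 0.5 * (a + b)

    z = u * m + (1.0 - u) * n
    # Coefficient preserving the projection onto phi(z); K(z, z) = 1 for RBF.
    alpha_z = f(u)
    return z, alpha_z, u
```

In the symmetric case (equal coefficients), the maximizer is u = 0.5 and z is the midpoint of the two vectors; repeating this merge on the nearest same-class pair until a size or accuracy budget is met gives the iterative reduction process the abstract describes.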

Cite

Text

Nguyen and Ho. "An Efficient Method for Simplifying Support Vector Machines." International Conference on Machine Learning, 2005. doi:10.1145/1102351.1102429

Markdown

[Nguyen and Ho. "An Efficient Method for Simplifying Support Vector Machines." International Conference on Machine Learning, 2005.](https://mlanthology.org/icml/2005/nguyen2005icml-efficient/) doi:10.1145/1102351.1102429

BibTeX

@inproceedings{nguyen2005icml-efficient,
  title     = {{An Efficient Method for Simplifying Support Vector Machines}},
  author    = {Nguyen, DucDung and Ho, Tu Bao},
  booktitle = {International Conference on Machine Learning},
  year      = {2005},
  pages     = {617--624},
  doi       = {10.1145/1102351.1102429},
  url       = {https://mlanthology.org/icml/2005/nguyen2005icml-efficient/}
}