Improving the Accuracy and Speed of Support Vector Machines
Abstract
Support Vector Learning Machines (SVMs) are finding application in pattern recognition, regression estimation, and operator inversion for ill-posed problems. Against this very general backdrop, any methods for improving the generalization performance, or for improving the speed in test phase, of SVMs are of increasing interest. In this paper we combine two such techniques on a pattern recognition problem. The method for improving generalization performance (the "virtual support vector" method) does so by incorporating known invariances of the problem. This method achieves a drop in the error rate on 10,000 NIST test digit images from 1.4% to 1.0%. The method for improving the speed (the "reduced set" method) does so by approximating the support vector decision surface. We apply this method to achieve a factor of fifty speedup in test phase over the virtual support vector machine. The combined approach yields a machine which is both 22 times faster than the original machine, and which has better generalization performance, achieving 1.1% error. The virtual support vector method is applicable to any SVM problem with known invariances. The reduced set method is applicable to any support vector machine.
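The virtual support vector idea described in the abstract can be sketched as follows: train an SVM, keep only its support vectors, generate transformed copies of them under a known invariance (here, one-pixel translations of digit images), and retrain on the enlarged set. This is a minimal illustration only; the dataset (scikit-learn's 8x8 digits rather than NIST), the shift-with-wraparound helper, and the polynomial kernel settings are assumptions for the sketch, not the paper's setup.

```python
import numpy as np
from sklearn import datasets, svm

digits = datasets.load_digits()
X = digits.images / 16.0          # 8x8 grayscale digit images, scaled to [0, 1]
y = digits.target
X_train, y_train = X[:1000], y[:1000]

# 1. Train an initial SVM on the flattened images.
clf = svm.SVC(kernel="poly", degree=3)
clf.fit(X_train.reshape(len(X_train), -1), y_train)

# 2. Extract the support vectors (as images) and their labels.
sv_idx = clf.support_
sv_imgs = X_train[sv_idx]
sv_labels = y_train[sv_idx]

# 3. Generate "virtual" support vectors by shifting each image one pixel
#    in each of four directions (translation invariance of digit images).
#    np.roll wraps pixels around the edge -- a simplification for this sketch.
def shift(img, dx, dy):
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

virtual_imgs = [sv_imgs]
virtual_labels = [sv_labels]
for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]:
    virtual_imgs.append(np.array([shift(im, dx, dy) for im in sv_imgs]))
    virtual_labels.append(sv_labels)

X_virtual = np.concatenate(virtual_imgs).reshape(-1, 64)
y_virtual = np.concatenate(virtual_labels)

# 4. Retrain on the support vectors plus their virtual copies.
clf_virtual = svm.SVC(kernel="poly", degree=3)
clf_virtual.fit(X_virtual, y_virtual)
```

Retraining on support vectors alone (rather than the full training set) works because the support vectors summarize the decision boundary; adding their transformed copies pushes the boundary to respect the invariance.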
Cite

Burges and Schölkopf. "Improving the Accuracy and Speed of Support Vector Machines." Neural Information Processing Systems, 1996.
https://mlanthology.org/neurips/1996/burges1996neurips-improving/

BibTeX
@inproceedings{burges1996neurips-improving,
title = {{Improving the Accuracy and Speed of Support Vector Machines}},
author = {Burges, Christopher J. C. and Schölkopf, Bernhard},
booktitle = {Neural Information Processing Systems},
year = {1996},
pages = {375--381},
url = {https://mlanthology.org/neurips/1996/burges1996neurips-improving/}
}