Incremental Learning and Selective Sampling via Parametric Optimization Framework for SVM
Abstract
We propose a framework based on a parametric quadratic programming (QP) technique to solve the support vector machine (SVM) training problem. This framework can be specialized to obtain two SVM optimization methods. The first solves the fixed-bias problem, while the second starts with an optimal solution for a fixed-bias problem and adjusts the bias until the optimal value is found. The latter method can be applied in conjunction with any other existing technique which obtains a fixed-bias solution. Moreover, the second method can also be used independently to solve the complete SVM training problem. A combination of these two methods is more flexible than each individual method and, among other things, yields an incremental algorithm which exactly solves the 1-Norm Soft Margin SVM optimization problem. Applying selective sampling techniques may further boost convergence.
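To make the fixed-bias subproblem concrete: when the bias b is held fixed, the dual of the 1-Norm Soft Margin SVM loses its equality constraint and keeps only the box constraints 0 ≤ α_i ≤ C, so even a simple projected-gradient solver applies. The sketch below is plain NumPy and is not the authors' parametric-QP method; the function name, learning rate, and iteration count are illustrative assumptions.

```python
import numpy as np

def fixed_bias_dual(K, y, C, b, steps=2000, lr=0.01):
    """Projected-gradient sketch of the fixed-bias 1-norm soft-margin dual:
    maximize  sum_i alpha_i * (1 - y_i * b) - 0.5 * alpha^T Q alpha
    subject to  0 <= alpha_i <= C  (no equality constraint, since b is fixed).
    """
    Q = np.outer(y, y) * K                 # Q_ij = y_i y_j K(x_i, x_j)
    alpha = np.zeros(len(y))
    for _ in range(steps):
        grad = (1.0 - y * b) - Q @ alpha   # gradient of the dual objective
        alpha = np.clip(alpha + lr * grad, 0.0, C)  # project onto the box
    return alpha
```

A full solver built on this subroutine would then vary b, as in the paper's second method, until the optimal bias is reached.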
Cite
Fine and Scheinberg. "Incremental Learning and Selective Sampling via Parametric Optimization Framework for SVM." Neural Information Processing Systems, 2001.
@inproceedings{fine2001neurips-incremental,
title = {{Incremental Learning and Selective Sampling via Parametric Optimization Framework for SVM}},
author = {Fine, Shai and Scheinberg, Katya},
booktitle = {Neural Information Processing Systems},
year = {2001},
pages = {705--711},
url = {https://mlanthology.org/neurips/2001/fine2001neurips-incremental/}
}