Choosing Multiple Parameters for Support Vector Machines
Abstract
The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered. This is done by minimizing some estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters. Usual methods for choosing parameters, based on exhaustive search, become intractable as soon as the number of parameters exceeds two. Some experimental results assess the feasibility of our approach for a large number of parameters (more than 100) and demonstrate an improvement of generalization performance.
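The core idea of the abstract can be illustrated with a minimal sketch: replace grid search with gradient descent on a validation-error estimate, over one scale parameter per input feature. The snippet below is a hypothetical toy version, not the paper's method: it uses kernel ridge regression instead of an SVM, a held-out-set squared error instead of the paper's differentiable error bounds, and finite-difference gradients instead of analytic ones. All names and data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data: only the first of 5 features is informative.
X = rng.normal(size=(200, 5))
y = np.where(X[:, 0] + 0.5 * rng.normal(size=200) > 0, 1.0, -1.0)
Xtr, ytr, Xva, yva = X[:100], y[:100], X[100:], y[100:]

def rbf(A, B, log_sigma):
    """Anisotropic RBF kernel with one length-scale per feature (log-parameterized)."""
    s = np.exp(log_sigma)                    # sigma_k > 0 by construction
    d = (A[:, None, :] - B[None, :, :]) / s  # per-feature scaled differences
    return np.exp(-0.5 * np.sum(d ** 2, axis=-1))

def val_error(log_sigma, lam=1e-2):
    """Validation MSE of a kernel ridge fit on the training split."""
    Ktr = rbf(Xtr, Xtr, log_sigma)
    alpha = np.linalg.solve(Ktr + lam * np.eye(len(Xtr)), ytr)
    pred = rbf(Xva, Xtr, log_sigma) @ alpha
    return np.mean((pred - yva) ** 2)

# Gradient descent over all 5 log-length-scales at once, via central
# finite differences; grid search over 5 parameters would be intractable.
theta = np.zeros(5)                          # start with sigma_k = 1 everywhere
init_err = val_error(theta)
best_theta, best_err = theta.copy(), init_err
for _ in range(50):
    grad = np.zeros_like(theta)
    for k in range(len(theta)):
        e = np.zeros_like(theta)
        e[k] = 1e-4
        grad[k] = (val_error(theta + e) - val_error(theta - e)) / 2e-4
    theta -= 0.2 * grad
    err = val_error(theta)
    if err < best_err:
        best_theta, best_err = theta.copy(), err
```

After descent, `best_err` should be below the error at the starting point, and the informative feature tends to receive a smaller length-scale (i.e., more weight). The paper's actual contribution is differentiating smooth generalization-error estimates (such as the radius/margin bound) analytically, which scales this idea to more than 100 parameters.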
Cite
Text
Chapelle et al. "Choosing Multiple Parameters for Support Vector Machines." Machine Learning, 2002. doi:10.1023/A:1012450327387

Markdown
[Chapelle et al. "Choosing Multiple Parameters for Support Vector Machines." Machine Learning, 2002.](https://mlanthology.org/mlj/2002/chapelle2002mlj-choosing/) doi:10.1023/A:1012450327387

BibTeX
@article{chapelle2002mlj-choosing,
title = {{Choosing Multiple Parameters for Support Vector Machines}},
author = {Chapelle, Olivier and Vapnik, Vladimir and Bousquet, Olivier and Mukherjee, Sayan},
journal = {Machine Learning},
year = {2002},
  pages = {131--159},
doi = {10.1023/A:1012450327387},
volume = {46},
url = {https://mlanthology.org/mlj/2002/chapelle2002mlj-choosing/}
}