A Feature Selection Algorithm Based on the Global Minimization of a Generalization Error Bound
Abstract
A novel linear feature selection algorithm is presented based on the global minimization of a data-dependent generalization error bound. Feature selection and scaling algorithms often lead to non-convex optimization problems, which in many previous approaches were addressed through gradient descent procedures that can only guarantee convergence to a local minimum. We propose an alternative approach, whereby the global solution of the non-convex optimization problem is derived via an equivalent optimization problem. Moreover, the convex optimization task is reduced to a conic quadratic programming problem for which efficient solvers are available. Highly competitive numerical results on both artificial and real-world data sets are reported.
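The abstract notes that the convex task is reduced to a conic quadratic (second-order cone) program. As a toy illustration of this problem class only (not the paper's algorithm), the following sketch solves a minimal conic quadratic program, minimizing a linear objective over the unit Euclidean ball, with a general-purpose solver; the data `c` and the use of `scipy.optimize.minimize` are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy conic quadratic program (illustrative, not the paper's method):
#   minimize  c^T x   subject to  ||x||_2 <= 1
# The optimum is x* = -c / ||c||, with objective value -||c||.
c = np.array([3.0, 4.0])

result = minimize(
    fun=lambda x: c @ x,                                   # linear objective
    x0=np.array([0.1, 0.1]),                               # feasible start away from 0
    constraints={"type": "ineq",                           # 1 - ||x|| >= 0
                 "fun": lambda x: 1.0 - np.linalg.norm(x)},
)

print(result.fun)  # close to -||c|| = -5
```

Dedicated conic solvers exploit the second-order cone structure directly and scale far better than a generic nonlinear routine; the generic solver is used here only to keep the sketch self-contained.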
Cite
Text
Peleg and Meir. "A Feature Selection Algorithm Based on the Global Minimization of a Generalization Error Bound." Neural Information Processing Systems, 2004.
Markdown
[Peleg and Meir. "A Feature Selection Algorithm Based on the Global Minimization of a Generalization Error Bound." Neural Information Processing Systems, 2004.](https://mlanthology.org/neurips/2004/peleg2004neurips-feature/)
BibTeX
@inproceedings{peleg2004neurips-feature,
title = {{A Feature Selection Algorithm Based on the Global Minimization of a Generalization Error Bound}},
author = {Peleg, Dori and Meir, Ron},
booktitle = {Neural Information Processing Systems},
year = {2004},
pages = {1065--1072},
url = {https://mlanthology.org/neurips/2004/peleg2004neurips-feature/}
}