Margin Based Feature Selection - Theory and Algorithms
Abstract
Feature selection is the task of choosing a small set out of a given set of features that capture the relevant properties of the data. In the context of supervised classification problems the relevance is determined by the given labels on the training data. A good choice of features is a key for building compact and accurate classifiers. In this paper we introduce a margin based feature selection criterion and apply it to measure the quality of sets of features. Using margins we devise novel selection algorithms for multi-class classification problems and provide a theoretical generalization bound. We also study the well known Relief algorithm and show that it resembles a gradient ascent over our margin criterion. We apply our new algorithm to various datasets and show that our new Simba algorithm, which directly optimizes the margin, outperforms Relief.
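The Relief-style weight update the abstract alludes to can be sketched as follows: for each training point, find its nearest neighbor of the same class (the "hit") and of a different class (the "miss"), then reward features that separate the point from its miss and penalize features that separate it from its hit. This is a minimal illustrative sketch, not the paper's exact algorithm; the function name `relief_weights` and the use of L1 distances are assumptions for the example.

```python
import numpy as np

def relief_weights(X, y):
    """Illustrative Relief-style feature weighting.

    For each sample, the weight of feature i grows by
    |x_i - miss_i| - |x_i - hit_i|, so features that push the
    nearest miss away (and keep the nearest hit close) score high.
    """
    n, d = X.shape
    w = np.zeros(d)
    for i in range(n):
        same = (y == y[i])
        dists = np.abs(X - X[i]).sum(axis=1)  # L1 distance to every point
        dists[i] = np.inf                     # exclude the point itself
        hits = np.where(same)[0]
        misses = np.where(~same)[0]
        hit = hits[np.argmin(dists[hits])]    # nearest same-class point
        miss = misses[np.argmin(dists[misses])]  # nearest other-class point
        w += np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
    return w

# Toy data: feature 0 separates the classes, feature 1 is noise,
# so feature 0 should receive the larger weight.
X = np.array([[0.0, 0.0], [0.1, 1.0], [1.0, 0.5], [0.9, 0.2]])
y = np.array([0, 0, 1, 1])
w = relief_weights(X, y)
```

Simba, by contrast, performs gradient ascent directly on the sample margin of a *weighted* distance, re-evaluating the nearest hit and miss under the current weights at each step, which is what lets it optimize the margin criterion directly.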
Cite
Text

Gilad-Bachrach et al. "Margin Based Feature Selection - Theory and Algorithms." International Conference on Machine Learning, 2004. doi:10.1145/1015330.1015352

Markdown

[Gilad-Bachrach et al. "Margin Based Feature Selection - Theory and Algorithms." International Conference on Machine Learning, 2004.](https://mlanthology.org/icml/2004/giladbachrach2004icml-margin/) doi:10.1145/1015330.1015352

BibTeX
@inproceedings{giladbachrach2004icml-margin,
title = {{Margin Based Feature Selection - Theory and Algorithms}},
author = {Gilad-Bachrach, Ran and Navot, Amir and Tishby, Naftali},
booktitle = {International Conference on Machine Learning},
year = {2004},
doi = {10.1145/1015330.1015352},
url = {https://mlanthology.org/icml/2004/giladbachrach2004icml-margin/}
}