Jointly Informative Feature Selection
Abstract
We propose several novel criteria for selecting groups of jointly informative continuous features in the context of classification. Our approach combines a Gaussian model of the feature responses with derived upper bounds on their mutual information with the class label and on their joint entropy. We further propose specific algorithmic implementations of these criteria which reduce their computational complexity by up to two orders of magnitude, making these strategies tractable in practice. Experiments on multiple computer-vision databases, using several types of classifiers, show that this class of methods outperforms state-of-the-art baselines in terms of both speed and classification accuracy.
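The exact criteria are not spelled out on this page, but the Gaussian-bound idea lends itself to a short sketch. Below is a minimal, hypothetical Python illustration (function names `gaussian_mi_bound` and `greedy_select` are our own, not the paper's): the conditional entropy of a selected feature group is computed from per-class Gaussian fits, the marginal entropy is upper-bounded by that of a Gaussian with matching covariance (the maximum-entropy distribution for a fixed covariance), and features are added greedily to maximize the resulting bound on the mutual information with the class label.

```python
import numpy as np

def gaussian_mi_bound(X, y, subset):
    """Upper bound on I(X_S; Y) for feature subset S under a Gaussian model.

    H(X_S) is upper-bounded by the entropy of a Gaussian with the same
    covariance; H(X_S | Y) is computed from per-class Gaussian fits.
    Constant terms (the (2*pi*e)^d factors) cancel in the difference.
    Assumes every class has enough samples for a covariance estimate.
    """
    Xs = X[:, subset]
    d = len(subset)
    eps = 1e-6 * np.eye(d)  # regularizer for numerically stable log-dets
    _, logdet_all = np.linalg.slogdet(
        np.cov(Xs, rowvar=False).reshape(d, d) + eps)
    h_marginal = 0.5 * logdet_all
    h_cond = 0.0
    for c in np.unique(y):
        Xc = Xs[y == c]
        p_c = len(Xc) / len(Xs)
        _, logdet_c = np.linalg.slogdet(
            np.cov(Xc, rowvar=False).reshape(d, d) + eps)
        h_cond += p_c * 0.5 * logdet_c
    return h_marginal - h_cond

def greedy_select(X, y, k):
    """Forward selection: at each step, add the feature whose inclusion
    maximizes the Gaussian MI bound of the selected group."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        scores = [gaussian_mi_bound(X, y, selected + [j]) for j in remaining]
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected
```

A naive implementation like this refits all covariance log-determinants at every candidate evaluation; the order-of-magnitude speedups mentioned in the abstract presumably come from more careful implementations of the criteria, such as incremental updates of the determinants as features are added.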
Cite
Text
Lefakis and Fleuret. "Jointly Informative Feature Selection." International Conference on Artificial Intelligence and Statistics, 2014.
Markdown
[Lefakis and Fleuret. "Jointly Informative Feature Selection." International Conference on Artificial Intelligence and Statistics, 2014.](https://mlanthology.org/aistats/2014/lefakis2014aistats-jointly/)
BibTeX
@inproceedings{lefakis2014aistats-jointly,
title = {{Jointly Informative Feature Selection}},
author = {Lefakis, Leonidas and Fleuret, François},
booktitle = {International Conference on Artificial Intelligence and Statistics},
year = {2014},
pages = {567--575},
url = {https://mlanthology.org/aistats/2014/lefakis2014aistats-jointly/}
}