Feature Selection for Ensembles

Abstract

The traditional motivation behind feature selection algorithms is to find the best subset of features for a task using one particular learning algorithm. Given the recent success of ensembles, however, we investigate the notion of ensemble feature selection in this paper. This task is harder than traditional feature selection in that one not only needs to find features germane to the learning task and learning algorithm, but one also needs to find a set of feature subsets that will promote disagreement among the ensemble's classifiers. In this paper, we present an ensemble feature selection approach that is based on genetic algorithms. Our algorithm shows improved performance over the popular and powerful ensemble approaches of AdaBoost and Bagging and demonstrates the utility of ensemble feature selection.

Introduction

Feature selection algorithms attempt to find and remove the features which are unhelpful or destructive to learning (Almuallim & Dietterich 1994; Cherkauer & Shavlik 1...
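The core idea in the abstract, evolving feature subsets whose fitness rewards both individual accuracy and disagreement with the members already in the ensemble, can be sketched roughly as follows. This is an illustrative toy, not the paper's algorithm: the fitness terms (`subset_accuracy`, `disagreement`), the weighting `lam`, and the GA operators are all stand-in assumptions.

```python
import random

N_FEATURES = 10  # each individual is a bitmask over the feature set

def subset_accuracy(mask):
    # Stand-in for training a classifier on this subset and measuring
    # accuracy: pretend features 0-4 are informative and 5-9 are noise.
    return sum(mask[:5]) / 5.0 - 0.1 * sum(mask[5:])

def disagreement(mask, ensemble):
    # Reward subsets that differ from those already in the ensemble
    # (normalized Hamming distance to existing members).
    if not ensemble:
        return 0.0
    diffs = [sum(a != b for a, b in zip(mask, m)) for m in ensemble]
    return sum(diffs) / (len(diffs) * N_FEATURES)

def fitness(mask, ensemble, lam=0.5):
    # lam trades off member accuracy against ensemble diversity
    # (an assumed weighting, not the paper's).
    return subset_accuracy(mask) + lam * disagreement(mask, ensemble)

def mutate(mask, rate=0.1):
    return [b ^ (random.random() < rate) for b in mask]  # flip bits

def crossover(a, b):
    cut = random.randrange(1, N_FEATURES)  # one-point crossover
    return a[:cut] + b[cut:]

def ga_select(ensemble, pop_size=20, generations=30):
    # Evolve one feature subset for the next ensemble member.
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda m: fitness(m, ensemble), reverse=True)
        survivors = pop[: pop_size // 2]  # elitist truncation selection
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return max(pop, key=lambda m: fitness(m, ensemble))

random.seed(0)
ensemble = []
for _ in range(3):  # grow a 3-member ensemble, one subset per member
    ensemble.append(ga_select(ensemble))
```

Because later members are scored partly on their distance from earlier ones, the resulting subsets tend to overlap on the informative features while differing elsewhere, which is the disagreement-promoting behavior the abstract describes.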

Cite

Text

Opitz. "Feature Selection for Ensembles." AAAI Conference on Artificial Intelligence, 1999.

Markdown

[Opitz. "Feature Selection for Ensembles." AAAI Conference on Artificial Intelligence, 1999.](https://mlanthology.org/aaai/1999/opitz1999aaai-feature/)

BibTeX

@inproceedings{opitz1999aaai-feature,
  title     = {{Feature Selection for Ensembles}},
  author    = {Opitz, David W.},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {1999},
  pages     = {379--384},
  url       = {https://mlanthology.org/aaai/1999/opitz1999aaai-feature/}
}