Optimal Feature Selection for Decision Robustness in Bayesian Networks
Abstract
In many applications, one can define a large set of features to support the classification task at hand. At test time, however, these become prohibitively expensive to evaluate, and only a small subset of features is used, often selected for their information-theoretic value. For threshold-based Naive Bayes classifiers, recent work has suggested selecting features that maximize the expected robustness of the classifier, that is, the expected probability that it maintains its decision after seeing more features. We propose the first algorithm to compute this expected same-decision probability for general Bayesian network classifiers, based on compiling the network into a tractable circuit representation. Moreover, we develop a search algorithm for optimal feature selection that utilizes efficient incremental circuit modifications. Experiments on Naive Bayes, as well as more general networks, show the efficacy and distinct behavior of this decision-making approach.
Cite
Text
Choi et al. "Optimal Feature Selection for Decision Robustness in Bayesian Networks." International Joint Conference on Artificial Intelligence, 2017. doi:10.24963/IJCAI.2017/215
Markdown
[Choi et al. "Optimal Feature Selection for Decision Robustness in Bayesian Networks." International Joint Conference on Artificial Intelligence, 2017.](https://mlanthology.org/ijcai/2017/choi2017ijcai-optimal/) doi:10.24963/IJCAI.2017/215
BibTeX
@inproceedings{choi2017ijcai-optimal,
title = {{Optimal Feature Selection for Decision Robustness in Bayesian Networks}},
author = {Choi, YooJung and Darwiche, Adnan and Van den Broeck, Guy},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2017},
pages = {1554--1560},
doi = {10.24963/IJCAI.2017/215},
url = {https://mlanthology.org/ijcai/2017/choi2017ijcai-optimal/}
}