On Robust Trimming of Bayesian Network Classifiers
Abstract
This paper considers the problem of removing costly features from a Bayesian network classifier. We want the classifier to be robust to these changes and to maintain its classification behavior. To this end, we propose a closeness metric between Bayesian classifiers, called the expected classification agreement (ECA). Our corresponding trimming algorithm finds an optimal subset of features and a new classification threshold that maximize the expected agreement, subject to a budgetary constraint. It utilizes new theoretical insights to perform branch-and-bound search in the space of feature sets, while computing bounds on the ECA. Our experiments investigate both the runtime cost of trimming and its effect on the robustness and accuracy of the final classifier.
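To make the search described in the abstract concrete, the sketch below shows a generic branch-and-bound over feature subsets under a budget. It is only an illustration under assumptions: the ECA evaluation (eca) and its upper bound (eca_upper_bound) are hypothetical user-supplied black boxes, the choice of the new classification threshold is assumed to be folded into that evaluation, and this is not the specific bound computation or search order of the paper.

# Minimal branch-and-bound sketch: pick a subset of features to keep,
# maximizing an agreement score, subject to a cost budget.
# `eca(kept)` and `eca_upper_bound(kept, undecided)` are assumed placeholders.
def trim(features, costs, budget, eca, eca_upper_bound):
    best = {"subset": frozenset(), "score": float("-inf")}

    def search(i, kept, spent):
        # All features decided: evaluate this candidate subset exactly.
        if i == len(features):
            score = eca(kept)
            if score > best["score"]:
                best["subset"], best["score"] = kept, score
            return
        # Prune if even an optimistic completion cannot beat the incumbent.
        if eca_upper_bound(kept, features[i:]) <= best["score"]:
            return
        # Branch 1: keep feature i if the budget still allows it.
        f = features[i]
        if spent + costs[f] <= budget:
            search(i + 1, kept | {f}, spent + costs[f])
        # Branch 2: trim feature i.
        search(i + 1, kept, spent)

    search(0, frozenset(), 0.0)
    return best["subset"], best["score"]

With trivial placeholders (for example, eca = len and eca_upper_bound = lambda kept, rest: len(kept) + len(rest)), the search simply keeps as many features as the budget allows; in the intended use, tighter ECA bounds are what make the pruning effective.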
Cite
Text
Choi and Van den Broeck. "On Robust Trimming of Bayesian Network Classifiers." International Joint Conference on Artificial Intelligence, 2018. doi:10.24963/IJCAI.2018/694
Markdown
[Choi and Van den Broeck. "On Robust Trimming of Bayesian Network Classifiers." International Joint Conference on Artificial Intelligence, 2018.](https://mlanthology.org/ijcai/2018/choi2018ijcai-robust/) doi:10.24963/IJCAI.2018/694
BibTeX
@inproceedings{choi2018ijcai-robust,
title = {{On Robust Trimming of Bayesian Network Classifiers}},
author = {Choi, YooJung and Van den Broeck, Guy},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2018},
pages = {5002--5009},
doi = {10.24963/IJCAI.2018/694},
url = {https://mlanthology.org/ijcai/2018/choi2018ijcai-robust/}
}