Bounding the False Discovery Rate in Local Bayesian Network Learning
Abstract
Modern Bayesian Network learning algorithms are time-efficient, scalable, and produce high-quality models; these algorithms feature prominently in decision support model development, variable selection, and causal discovery. The quality of the models, however, has often only been empirically evaluated; the available theoretical results typically guarantee asymptotic correctness (consistency) of the algorithms. This paper describes theoretical bounds on the quality of a fundamental Bayesian Network local-learning task in the finite sample using theories for controlling the False Discovery Rate. The behavior of the derived bounds is investigated across various problem and algorithm parameters. Empirical results support the theory, which has immediate ramifications in the design of new algorithms for Bayesian Network learning, variable selection, and causal discovery.
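The abstract's central idea is controlling the False Discovery Rate over the many statistical independence tests a local-learning algorithm performs when deciding which variables belong to a target's neighborhood. As a minimal, hedged sketch of this kind of multiple-testing control (not the paper's specific procedure), the standard Benjamini-Hochberg step-up rule over a set of test p-values looks like this; the p-values below are illustrative, not taken from the paper:

```python
# Sketch: Benjamini-Hochberg FDR control over p-values from
# (conditional) independence tests of the form "is X_i dependent
# on the target T?". Illustrative only; the paper's own bounds and
# procedure differ in their details.

def benjamini_hochberg(p_values, q=0.05):
    """Return (sorted) indices of hypotheses rejected at FDR level q."""
    m = len(p_values)
    # Sort p-values ascending, remembering original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    # Find the largest rank k (1-based) with p_(k) <= (k / m) * q.
    k_max = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= rank / m * q:
            k_max = rank
    # Reject the k_max hypotheses with the smallest p-values.
    return sorted(order[:k_max])

# Hypothetical p-values for six candidate neighbors of a target node:
pvals = [0.001, 0.008, 0.039, 0.041, 0.25, 0.6]
print(benjamini_hochberg(pvals, q=0.05))  # → [0, 1]
```

Note the step-up character of the rule: a p-value may be rejected even if it exceeds its own threshold, as long as some larger p-value passes its (more lenient) one.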
Cite
Text
Tsamardinos and Brown. "Bounding the False Discovery Rate in Local Bayesian Network Learning." AAAI Conference on Artificial Intelligence, 2008.
BibTeX
@inproceedings{tsamardinos2008aaai-bounding,
title = {{Bounding the False Discovery Rate in Local Bayesian Network Learning}},
author = {Tsamardinos, Ioannis and Brown, Laura E.},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2008},
pages = {1100-1105},
url = {https://mlanthology.org/aaai/2008/tsamardinos2008aaai-bounding/}
}