Sequentially Fitting "Inclusive" Trees for Inference in Noisy-or Networks
Abstract
An important class of problems can be cast as inference in noisy-OR Bayesian networks, where the binary state of each variable is a logical OR of noisy versions of the states of the variable's parents. For example, in medical diagnosis, the presence of a symptom can be expressed as a noisy-OR of the diseases that may cause the symptom - on some occasions, a disease may fail to activate the symptom. Inference in richly-connected noisy-OR networks is intractable, but approximate methods (e.g., variational techniques) are showing increasing promise as practical solutions. One problem with most approximations is that they tend to concentrate on a relatively small number of modes in the true posterior, ignoring other plausible configurations of the hidden variables. We introduce a new sequential variational method for bipartite noisy-OR networks that favors including all modes of the true posterior and models the posterior distribution as a tree. We compare this method with other approximations using an ensemble of networks with network statistics that are comparable to the QMR-DT medical diagnostic network.
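As a rough illustration of the noisy-OR conditional distribution described in the abstract, the following minimal Python sketch computes the probability that a symptom is present given a binary vector of parent diseases. It uses the standard noisy-OR parameterization; the names noisy_or_prob, activation, and leak are illustrative and not taken from the paper.

import numpy as np

def noisy_or_prob(diseases, activation, leak=0.0):
    # Illustrative sketch of a standard noisy-OR likelihood, not the paper's code.
    # diseases   : binary vector d_i, 1 if parent disease i is present
    # activation : q_i, probability that disease i alone activates the symptom
    # leak       : probability the symptom appears with no disease present
    diseases = np.asarray(diseases, dtype=float)
    activation = np.asarray(activation, dtype=float)
    # Each present disease independently fails to activate the symptom with
    # probability (1 - q_i); the symptom stays off only if every cause fails.
    p_off = (1.0 - leak) * np.prod((1.0 - activation) ** diseases)
    return 1.0 - p_off

# Example: diseases 1 and 3 present, with per-disease activation probabilities
print(noisy_or_prob([1, 0, 1], [0.8, 0.5, 0.3], leak=0.05))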
Cite
Text
Frey et al. "Sequentially Fitting ``Inclusive'' Trees for Inference in Noisy-or Networks." Neural Information Processing Systems, 2000.
Markdown
[Frey et al. "Sequentially Fitting ``Inclusive'' Trees for Inference in Noisy-or Networks." Neural Information Processing Systems, 2000.](https://mlanthology.org/neurips/2000/frey2000neurips-sequentially/)
BibTeX
@inproceedings{frey2000neurips-sequentially,
title = {{Sequentially Fitting ``Inclusive'' Trees for Inference in Noisy-or Networks}},
author = {Frey, Brendan J. and Patrascu, Relu and Jaakkola, Tommi and Moran, Jodi},
booktitle = {Neural Information Processing Systems},
year = {2000},
pages = {493--499},
url = {https://mlanthology.org/neurips/2000/frey2000neurips-sequentially/}
}