Generalized Evidence Pre-Propagated Importance Sampling for Hybrid Bayesian Networks
Abstract
In this paper, we first provide a new theoretical understanding of the Evidence Pre-propagated Importance Sampling algorithm (EPIS-BN) (Yuan & Druzdzel 2003; 2006b) and show that its importance function minimizes the KL-divergence between the function itself and the exact posterior probability distribution in polytrees. We then generalize the method to deal with inference in general hybrid Bayesian networks consisting of deterministic equations and arbitrary probability distributions. Using a novel technique called soft arc reversal, the new algorithm can also handle evidential reasoning with observed deterministic variables.
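The abstract builds on the basic principle of importance sampling for posterior inference: draw samples from a tractable importance function and weight them by how well they explain the evidence. The sketch below is not the EPIS-BN algorithm itself; it illustrates the underlying idea on a hypothetical two-node hybrid network (a Gaussian parent with a Gaussian-noise child, child observed), using the prior as the importance function. All names and the toy model are illustrative assumptions, not from the paper.

```python
import math
import random

# Toy hybrid network (illustrative, not from the paper):
#   A ~ Normal(0, 1),  B | A ~ Normal(A, 0.5),  B is observed.
# Importance sampling with the prior of A as the importance function:
# each sample is weighted by the likelihood of the observed evidence.

def normal_pdf(x, mu, sigma):
    """Density of Normal(mu, sigma) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior_mean_A(b_obs, n=200_000, seed=0):
    """Estimate E[A | B = b_obs] by self-normalized importance sampling."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        a = rng.gauss(0.0, 1.0)           # draw A from the importance function (here: its prior)
        w = normal_pdf(b_obs, a, 0.5)     # importance weight = likelihood of the evidence
        num += w * a
        den += w
    return num / den

# For this conjugate toy model the exact posterior mean is
# b_obs * 1 / (1 + 0.5**2) = 0.8 * b_obs, so the estimate should be near 0.8.
print(posterior_mean_A(1.0))
```

EPIS-BN improves on this naive scheme precisely because sampling from the prior ignores the evidence: its importance function is shaped by pre-propagated evidence so that samples land where the posterior has mass, which the abstract characterizes as minimizing the KL-divergence to the exact posterior in polytrees.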
Cite
Yuan and Druzdzel. "Generalized Evidence Pre-Propagated Importance Sampling for Hybrid Bayesian Networks." AAAI Conference on Artificial Intelligence, 2007.
BibTeX
@inproceedings{yuan2007aaai-generalized,
title = {{Generalized Evidence Pre-Propagated Importance Sampling for Hybrid Bayesian Networks}},
author = {Yuan, Changhe and Druzdzel, Marek J.},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2007},
pages = {1296-1303},
url = {https://mlanthology.org/aaai/2007/yuan2007aaai-generalized/}
}