Global Conditioning for Probabilistic Inference in Belief Networks
Abstract
In this paper we propose a new approach to probabilistic inference on belief networks, global conditioning, which is a simple generalization of Pearl's (1986b) method of loop-cutset conditioning. We show that global conditioning, as well as loop-cutset conditioning, can be thought of as a special case of the method of Lauritzen and Spiegelhalter (1988) as refined by Jensen et al. (1990a; 1990b). Nonetheless, this approach provides new opportunities for parallel processing and, in the case of sequential processing, a tradeoff of time for memory. We also show how a hybrid method (Suermondt et al. 1990) combining loop-cutset conditioning with Jensen's method can be viewed within our framework. By exploring the relationships between these methods, we develop a unifying framework in which the advantages of each approach can be combined successfully.
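As a rough illustration of the loop-cutset idea the abstract builds on (Pearl 1986b) — not of the paper's global-conditioning algorithm itself — the sketch below conditions on a cutset variable in a small "diamond" network A→B, A→C, B→D, C→D. Fixing the cutset {A} renders the remaining network singly connected, and the query P(D) is recovered as a mixture over cutset instantiations weighted by P(A). The network and all CPT numbers are made-up examples.

```python
# Hedged sketch of loop-cutset conditioning on a diamond network
# A -> B, A -> C, B -> D, C -> D (undirected loop A-B-D-C-A).
# All conditional probability tables below are illustrative only.

P_A = {0: 0.6, 1: 0.4}
P_B_given_A = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}   # P(B | A)
P_C_given_A = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.1, 1: 0.9}}   # P(C | A)
P_D_given_BC = {(0, 0): {0: 0.90, 1: 0.10},
                (0, 1): {0: 0.40, 1: 0.60},
                (1, 0): {0: 0.30, 1: 0.70},
                (1, 1): {0: 0.05, 1: 0.95}}                 # P(D | B, C)

def p_d_by_cutset_conditioning():
    """P(D) as a mixture over instantiations of the cutset {A}."""
    p_d = {0: 0.0, 1: 0.0}
    for a, pa in P_A.items():
        # With A fixed, B and C are conditionally independent and the
        # remaining network is singly connected (a polytree).
        for d in (0, 1):
            s = 0.0
            for b in (0, 1):
                for c in (0, 1):
                    s += (P_B_given_A[a][b] * P_C_given_A[a][c]
                          * P_D_given_BC[(b, c)][d])
            p_d[d] += pa * s  # weight the polytree result by P(A=a)
    return p_d

def p_d_by_full_enumeration():
    """Reference answer: brute-force sum over the full joint."""
    p_d = {0: 0.0, 1: 0.0}
    for a in (0, 1):
        for b in (0, 1):
            for c in (0, 1):
                for d in (0, 1):
                    p_d[d] += (P_A[a] * P_B_given_A[a][b]
                               * P_C_given_A[a][c] * P_D_given_BC[(b, c)][d])
    return p_d
```

The two routines agree exactly; the point of conditioning is that each inner computation runs on a singly connected network, at the cost of repeating it once per cutset instantiation — the time/memory tradeoff the abstract refers to.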
Cite
Shachter et al. "Global Conditioning for Probabilistic Inference in Belief Networks." Conference on Uncertainty in Artificial Intelligence, 1994. doi:10.1016/B978-1-55860-332-5.50070-5
@inproceedings{shachter1994uai-global,
title = {{Global Conditioning for Probabilistic Inference in Belief Networks}},
author = {Shachter, Ross D. and Andersen, Stig K. and Szolovits, Peter},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {1994},
pages = {514-522},
doi = {10.1016/B978-1-55860-332-5.50070-5},
url = {https://mlanthology.org/uai/1994/shachter1994uai-global/}
}