Efficient Approximations for the Marginal Likelihood of Incomplete Data Given a Bayesian Network

Abstract

We examine asymptotic approximations for the marginal likelihood of incomplete data given a Bayesian network. We consider the Laplace approximation and the less accurate but more computationally efficient BIC/MDL approximation. We also consider approximations proposed by Draper (1993) and by Cheeseman and Stutz (1995), the latter abbreviated CS. These approximations are as computationally efficient as BIC/MDL, but their accuracy has not been studied in any depth. We compare the accuracy of these approximations under the assumption that the Laplace approximation is the most accurate. In experiments using synthetic data generated from discrete naive-Bayes models having a hidden root node, we find that (1) the BIC/MDL measure is the least accurate, having a bias in favor of simple models, and (2) the Draper and CS measures are the most accurate, having biases in favor of simple and complex models, respectively.
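
For context, the approximations compared in the paper have well-known large-sample forms, sketched below in generic notation that is not transcribed from the paper itself. Here S denotes the network structure, D the incomplete data, D' a completed version of D, \tilde\theta the MAP parameter configuration (e.g., found by EM), \hat\theta the maximum-likelihood configuration, d the number of free parameters, N the number of cases, and A the negative Hessian of the log posterior evaluated at \tilde\theta. The Draper measure, a related refinement of BIC, is omitted from this sketch.

% Laplace approximation (assumed standard form):
\log p(D \mid S) \;\approx\; \log p(D \mid \tilde\theta, S)
  + \log p(\tilde\theta \mid S)
  + \frac{d}{2}\log 2\pi
  - \frac{1}{2}\log \lvert A \rvert

% BIC/MDL approximation (assumed standard form):
\log p(D \mid S) \;\approx\; \log p(D \mid \hat\theta, S) - \frac{d}{2}\log N

% Cheeseman--Stutz (CS) approximation (assumed standard form):
\log p(D \mid S) \;\approx\; \log p(D' \mid S)
  + \log p(D \mid \tilde\theta, S)
  - \log p(D' \mid \tilde\theta, S)

The appeal of the CS form is that the marginal likelihood of the completed data D' can often be computed in closed form, so the correction is no more expensive than the BIC/MDL score once \tilde\theta is available.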

Cite

Text

Chickering and Heckerman. "Efficient Approximations for the Marginal Likelihood of Incomplete Data Given a Bayesian Network." Conference on Uncertainty in Artificial Intelligence, 1996.

Markdown

[Chickering and Heckerman. "Efficient Approximations for the Marginal Likelihood of Incomplete Data Given a Bayesian Network." Conference on Uncertainty in Artificial Intelligence, 1996.](https://mlanthology.org/uai/1996/chickering1996uai-efficient/)

BibTeX

@inproceedings{chickering1996uai-efficient,
  title     = {{Efficient Approximations for the Marginal Likelihood of Incomplete Data Given a Bayesian Network}},
  author    = {Chickering, David Maxwell and Heckerman, David},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year      = {1996},
  pages     = {158-168},
  url       = {https://mlanthology.org/uai/1996/chickering1996uai-efficient/}
}