Learning Bayesian Networks with Restricted Causal Interactions

Abstract

A major problem for the learning of Bayesian networks (BNs) is the exponential number of parameters needed for conditional probability tables. Recent research reduces this complexity by modeling local structure in the probability tables. We examine the use of log-linear local models. While log-linear models in this context are not new (Whittaker, 1990; Buntine, 1991; Neal, 1992; Heckerman and Meek, 1997), they are generally subsumed under a naive Bayes model. We describe an alternative that uses a Minimum Message Length (MML) metric (Wallace and Freeman, 1987) for the selection of local models with causal independence, which we term first-order models (FOMs). We also combine FOMs and full conditional models on a node-by-node basis.
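
As a rough illustration of the parameter saving the abstract refers to: with k binary parents, a full conditional probability table has 2^k free parameters, while a first-order (main-effects-only log-linear) local model needs only k + 1. The Python sketch below is illustrative and not taken from the paper; the function names are hypothetical, and the FOM is rendered here as a logistic (additive log-odds) local model, a standard first-order form rather than the paper's exact MML formulation.

import math

def cpt_parameter_count(num_parents, arity=2):
    # A full conditional probability table stores one (arity - 1)-parameter
    # distribution per joint parent state: exponential in num_parents.
    return (arity - 1) * arity ** num_parents

def fom_parameter_count(num_parents, arity=2):
    # A first-order (main-effects-only) local model keeps one weight per
    # non-reference parent value plus a bias: linear in num_parents.
    return (arity - 1) * ((arity - 1) * num_parents + 1)

def fom_probability(bias, weights, parent_states):
    # P(child = 1 | parents) for a binary first-order model, taken here as a
    # logistic function of an additive score over the parent states.
    score = bias + sum(w * x for w, x in zip(weights, parent_states))
    return 1.0 / (1.0 + math.exp(-score))

if __name__ == "__main__":
    for k in (2, 5, 10):
        print(k, "binary parents:", cpt_parameter_count(k), "CPT parameters vs",
              fom_parameter_count(k), "FOM parameters")  # e.g. 10 parents: 1024 vs 11
    print(round(fom_probability(-1.0, [0.8, 1.5, -0.3], [1, 0, 1]), 3))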

Cite

Text

Neil et al. "Learning Bayesian Networks with Restricted Causal Interactions." Conference on Uncertainty in Artificial Intelligence, 1999.

Markdown

[Neil et al. "Learning Bayesian Networks with Restricted Causal Interactions." Conference on Uncertainty in Artificial Intelligence, 1999.](https://mlanthology.org/uai/1999/neil1999uai-learning/)

BibTeX

@inproceedings{neil1999uai-learning,
  title     = {{Learning Bayesian Networks with Restricted Causal Interactions}},
  author    = {Neil, Julian R. and Wallace, Chris S. and Korb, Kevin B.},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year      = {1999},
  pages     = {486--493},
  url       = {https://mlanthology.org/uai/1999/neil1999uai-learning/}
}