Bayesian Learning of Markov Network Structure

Abstract

We propose a simple and efficient approach to building undirected probabilistic classification models (Markov networks) that extend naïve Bayes classifiers and outperform existing directed probabilistic classifiers (Bayesian networks) of similar complexity. Our Markov network model is represented as a set of consistent probability distributions on subsets of variables. Inference with such a model can be done efficiently in closed form for problems like class probability estimation. We also propose a highly efficient Bayesian structure learning algorithm for conditional prediction problems, based on integrating along a hill-climb in the structure space. Our prior, based on the model's degrees of freedom, effectively prevents overfitting.
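The abstract describes two ingredients: a model built from probability distributions on subsets of variables, and a greedy hill-climb in structure space with a degrees-of-freedom prior to guard against overfitting. The sketch below is a hypothetical, simplified illustration of that idea, not the paper's method: it scores disjoint feature cliques (the paper allows consistent overlapping subsets and integrates over the climb), uses a BIC-style degrees-of-freedom penalty as a stand-in for the paper's prior, and runs on toy XOR data where no single feature predicts the class but the pair {0, 1} does.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Toy binary data: the class is the XOR of features 0 and 1, so neither
# feature alone is informative, but the clique {0, 1} is. Feature 2 is noise.
n, d = 600, 3
X = rng.integers(0, 2, size=(n, d))
y = X[:, 0] ^ X[:, 1]

def clique_logp(cols, X, y, alpha=1.0):
    """Per-sample smoothed log P(x_S | y) for the clique S = cols."""
    states = np.zeros(len(X), dtype=int)
    for c in cols:                      # encode the joint clique state
        states = states * 2 + X[:, c]
    k = 2 ** len(cols)
    logp = np.zeros(len(X))
    for cls in (0, 1):
        mask = y == cls
        counts = np.bincount(states[mask], minlength=k) + alpha
        logp[mask] = np.log(counts[states[mask]] / counts.sum())
    return logp

def score(partition, X, y):
    """Penalized log-likelihood: clique terms minus a degrees-of-freedom
    penalty (a BIC-style stand-in for the paper's Bayesian prior)."""
    ll = sum(clique_logp(S, X, y).sum() for S in partition)
    df = sum(2 * (2 ** len(S) - 1) for S in partition)  # params, both classes
    return ll - 0.5 * np.log(len(X)) * df

# Hill-climb in structure space: start from singleton cliques (naive Bayes)
# and greedily merge a pair of cliques whenever that improves the score.
partition = [(i,) for i in range(d)]
improved = True
while improved:
    improved = False
    best = score(partition, X, y)
    for a, b in combinations(range(len(partition)), 2):
        cand = [S for i, S in enumerate(partition) if i not in (a, b)]
        cand.append(partition[a] + partition[b])
        if score(cand, X, y) > best:
            partition, improved = cand, True
            break

print(sorted(tuple(sorted(S)) for S in partition))
```

On this data the climb merges features 0 and 1 into one clique and leaves the noise feature alone: the XOR signal buys a large likelihood gain, while further merges cost more in the degrees-of-freedom penalty than they gain in fit.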

Cite

Text

Jakulin and Rish. "Bayesian Learning of Markov Network Structure." European Conference on Machine Learning, 2006. doi:10.1007/11871842_22

Markdown

[Jakulin and Rish. "Bayesian Learning of Markov Network Structure." European Conference on Machine Learning, 2006.](https://mlanthology.org/ecmlpkdd/2006/jakulin2006ecml-bayesian/) doi:10.1007/11871842_22

BibTeX

@inproceedings{jakulin2006ecml-bayesian,
  title     = {{Bayesian Learning of Markov Network Structure}},
  author    = {Jakulin, Aleks and Rish, Irina},
  booktitle = {European Conference on Machine Learning},
  year      = {2006},
  pages     = {198--209},
  doi       = {10.1007/11871842_22},
  url       = {https://mlanthology.org/ecmlpkdd/2006/jakulin2006ecml-bayesian/}
}