Branch and Bound for Regular Bayesian Network Structure Learning

Abstract

We consider efficient Bayesian network structure learning (BNSL) using branch and bound. Thus far, efficient pruning rules have been found for minimizing the description length (MDL) rather than for maximizing posterior-probability scores such as the Bayesian Dirichlet equivalent uniform (BDeu), which has been used most often. In any BNSL, given data, the simplest structure should be chosen when structures fit the data equally well. We show that quotient scores satisfy this regularity while the BDeu does not. In this paper, utilizing regularity, we propose a pruning rule for maximizing the posterior probability based on quotient scores, and prove that the resulting bound is tighter than the existing one. We verify experimentally that the proposed bound applies significantly more often than the existing one for the BDeu as well. In addition, we prove that the MDL satisfies regularity and that the proposed bound behaves like that of the MDL, and observe that the proposed bound is almost as efficient as the one for the MDL. Our novel insight is that BNSL can be made efficient by utilizing regularity to construct a pruning rule.

Cite

Text

Suzuki and Kawahara. "Branch and Bound for Regular Bayesian Network Structure Learning." Conference on Uncertainty in Artificial Intelligence, 2017.

Markdown

[Suzuki and Kawahara. "Branch and Bound for Regular Bayesian Network Structure Learning." Conference on Uncertainty in Artificial Intelligence, 2017.](https://mlanthology.org/uai/2017/suzuki2017uai-branch/)

BibTeX

@inproceedings{suzuki2017uai-branch,
  title     = {{Branch and Bound for Regular Bayesian Network Structure Learning}},
  author    = {Suzuki, Joe and Kawahara, Jun},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year      = {2017},
  url       = {https://mlanthology.org/uai/2017/suzuki2017uai-branch/}
}