An Improved Admissible Heuristic for Learning Optimal Bayesian Networks

Abstract

Recently, two search algorithms, A* and breadth-first branch and bound (BFBnB), were developed based on a simple admissible heuristic for learning Bayesian network structures that optimize a scoring function. The heuristic represents a relaxation of the learning problem in which each variable chooses its optimal parents independently. As a result, the relaxed solution may contain many directed cycles, which yields a loose bound. This paper introduces an improved admissible heuristic that tries to avoid directed cycles within small groups of variables. A sparse representation is also introduced to store only the unique optimal parent choices. Empirical results show that the new techniques significantly improved the efficiency and scalability of A* and BFBnB on most of the datasets tested in this paper.
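The two bounds described in the abstract can be illustrated with a minimal sketch. The function names, score layout, and toy scores below are hypothetical, and local scores are treated as costs to be minimized: the simple heuristic lets every variable pick its best parent set independently (so mutually reinforcing choices can form a directed cycle), while the improved bound tightens this by forbidding the 2-cycle within a small group of variables.

```python
def simple_heuristic(remaining, local_scores):
    """Simple relaxation: each variable in `remaining` independently takes its
    best-scoring parent set, ignoring acyclicity entirely.
    local_scores: dict mapping var -> {frozenset(parents): cost}."""
    return sum(min(local_scores[var].values()) for var in remaining)

def best(var, forbidden, local_scores):
    """Best cost for `var` over parent sets that avoid `forbidden` variables."""
    return min(cost for parents, cost in local_scores[var].items()
               if not (parents & forbidden))

def pair_cost(a, b, local_scores):
    """Tightened joint bound for the group {a, b}: in any DAG at least one of
    the two variables cannot use the other as a parent, so we take the better
    of the two acyclic combinations instead of two independent optima."""
    return min(
        best(a, set(), local_scores) + best(b, {a}, local_scores),
        best(a, {b}, local_scores) + best(b, set(), local_scores),
    )

# Toy instance (hypothetical scores) where each variable's cheapest choice is
# the other one as parent -- the independent optima form a directed 2-cycle:
scores = {
    "A": {frozenset(): 5.0, frozenset({"B"}): 1.0},
    "B": {frozenset(): 5.0, frozenset({"A"}): 1.0},
}
print(simple_heuristic({"A", "B"}, scores))  # 2.0 (cyclic, loose bound)
print(pair_cost("A", "B", scores))           # 6.0 (tighter, still admissible)
```

Both values lower-bound the true optimal acyclic cost (here 6.0), but breaking the 2-cycle within the pair makes the second bound much tighter, which is the intuition behind the paper's improved heuristic.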

Cite

Text

Yuan and Malone. "An Improved Admissible Heuristic for Learning Optimal Bayesian Networks." Conference on Uncertainty in Artificial Intelligence, 2012.

Markdown

[Yuan and Malone. "An Improved Admissible Heuristic for Learning Optimal Bayesian Networks." Conference on Uncertainty in Artificial Intelligence, 2012.](https://mlanthology.org/uai/2012/yuan2012uai-improved/)

BibTeX

@inproceedings{yuan2012uai-improved,
  title     = {{An Improved Admissible Heuristic for Learning Optimal Bayesian Networks}},
  author    = {Yuan, Changhe and Malone, Brandon M.},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year      = {2012},
  pages     = {924--933},
  url       = {https://mlanthology.org/uai/2012/yuan2012uai-improved/}
}