Learning Bayesian Networks with Ancestral Constraints
Abstract
We consider the problem of learning Bayesian networks optimally when subject to background knowledge in the form of ancestral constraints. Our approach builds on a recently proposed framework for optimal structure learning with non-decomposable scores, which is general enough to accommodate ancestral constraints. That framework exploits oracles that learn structures using decomposable scores; such oracles cannot accommodate ancestral constraints directly, since these constraints are non-decomposable. We show how to empower these oracles by passing them decomposable constraints that they can handle, inferred from the ancestral constraints that they cannot. Empirically, we demonstrate that our approach can be orders of magnitude more efficient than alternative frameworks, such as those based on integer linear programming.
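To illustrate the flavor of such inferences, the Python sketch below derives forbidden edges (a decomposable constraint) from ancestral constraints: a negative constraint "X is not an ancestor of Y" rules out the edge X → Y, and a positive constraint "X is an ancestor of Z" combined with a negative constraint "Y is not an ancestor of Z" rules out the edge Y → X. The function name and data layout are illustrative assumptions, not the paper's implementation.

# Illustrative sketch (not the paper's algorithm): infer forbidden edges,
# which are decomposable constraints, from ancestral constraints.
# required  : set of pairs (x, y) meaning "x must be an ancestor of y"
# forbidden : set of pairs (x, y) meaning "x must not be an ancestor of y"

def infer_forbidden_edges(required, forbidden):
    banned = set()
    for (x, y) in forbidden:
        # The edge x -> y would make x an ancestor of y directly.
        banned.add((x, y))
    for (x, z) in required:
        for (y, w) in forbidden:
            if w == z and y != x:
                # If the edge y -> x existed, y would be an ancestor of x and
                # hence of z (since x must be an ancestor of z), violating (y, z).
                banned.add((y, x))
    return banned

# Example: A must be an ancestor of C, B must not be an ancestor of C.
print(sorted(infer_forbidden_edges({("A", "C")}, {("B", "C")})))
# [('B', 'A'), ('B', 'C')]

Edge-level constraints of this kind are decomposable, so an oracle that optimizes decomposable scores can enforce them during its search over structures.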
Cite
Text
Chen et al. "Learning Bayesian Networks with Ancestral Constraints." Neural Information Processing Systems, 2016.
Markdown
[Chen et al. "Learning Bayesian Networks with Ancestral Constraints." Neural Information Processing Systems, 2016.](https://mlanthology.org/neurips/2016/chen2016neurips-learning/)
BibTeX
@inproceedings{chen2016neurips-learning,
  title = {{Learning Bayesian Networks with Ancestral Constraints}},
  author = {Chen, Eunice Yuh-Jie and Shen, Yujia and Choi, Arthur and Darwiche, Adnan},
  booktitle = {Neural Information Processing Systems},
  year = {2016},
  pages = {2325--2333},
  url = {https://mlanthology.org/neurips/2016/chen2016neurips-learning/}
}