On the Parameterized Complexity of Polytree Learning

Abstract

A Bayesian network is a directed acyclic graph that represents statistical dependencies between variables of a joint probability distribution. A fundamental task in data science is to learn a Bayesian network from observed data. Polytree Learning is the problem of learning an optimal Bayesian network that fulfills the additional property that its underlying undirected graph is a forest. In this work, we revisit the complexity of Polytree Learning. We show that Polytree Learning can be solved in single-exponential FPT time with respect to the number of variables. Moreover, we consider the influence of d, the number of variables that might receive a nonempty parent set in the final DAG, on the complexity of Polytree Learning. We show that Polytree Learning is presumably not fixed-parameter tractable for d, unlike Bayesian network learning, which is fixed-parameter tractable for d. Finally, we show that if d and the maximum parent set size are bounded, then efficient algorithms can be obtained.
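To make the structural constraint concrete: a polytree is a DAG whose underlying undirected graph (its skeleton) contains no cycle; note that a forest skeleton already rules out directed cycles, since any directed cycle induces an undirected one. The following Python sketch is a hypothetical illustration of this definition, not code from the paper; the function name `is_polytree` and the edge-list representation are assumptions made for the example. It uses union-find over the skeleton: an edge joining two nodes that are already connected closes an undirected cycle.

```python
def is_polytree(nodes, edges):
    """Check whether the directed graph given by `edges` is a polytree,
    i.e., whether its underlying undirected graph (skeleton) is a forest.

    Union-find over skeleton edges: if an edge connects two nodes that
    already lie in the same component, the skeleton has a cycle.
    """
    parent = {v: v for v in nodes}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False  # this edge closes an undirected cycle
        parent[ru] = rv   # merge the two components
    return True


# A "v-structure" a -> c <- b is a polytree, while a directed
# triangle is not: its skeleton is an undirected cycle.
print(is_polytree("abc", [("a", "c"), ("b", "c")]))              # True
print(is_polytree("abc", [("a", "b"), ("b", "c"), ("a", "c")]))  # False
```

In score-based learning, the task the abstract refers to is then to find a parent set for each variable maximizing the total local score, subject to the chosen network satisfying exactly this forest-skeleton check.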

Cite

Text

Grüttemeier et al. "On the Parameterized Complexity of Polytree Learning." International Joint Conference on Artificial Intelligence, 2021. doi:10.24963/IJCAI.2021/580

Markdown

[Grüttemeier et al. "On the Parameterized Complexity of Polytree Learning." International Joint Conference on Artificial Intelligence, 2021.](https://mlanthology.org/ijcai/2021/gruttemeier2021ijcai-parameterized/) doi:10.24963/IJCAI.2021/580

BibTeX

@inproceedings{gruttemeier2021ijcai-parameterized,
  title     = {{On the Parameterized Complexity of Polytree Learning}},
  author    = {Grüttemeier, Niels and Komusiewicz, Christian and Morawietz, Nils},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2021},
  pages     = {4221--4227},
  doi       = {10.24963/IJCAI.2021/580},
  url       = {https://mlanthology.org/ijcai/2021/gruttemeier2021ijcai-parameterized/}
}