Time–Approximation Trade-Offs for Learning Bayesian Networks

Abstract

Bayesian network structure learning is an NP-hard problem, and it remains hard even for various subclasses of graphs. Motivated by the hardness of exact learning, we study approximation algorithms for learning Bayesian networks. First, we propose a moderately exponential-time algorithm with running time $\mathcal{O}(2^{\frac{\ell}{r}n})$ and approximation ratio $\frac{\ell}{r}$, where $n$ is the number of vertices and $\ell$ and $r$ are user-defined parameters with $\ell\leq r$. That is, we give time–approximation trade-offs for learning Bayesian networks. Second, we present a polynomial-time algorithm with approximation ratio $\frac{1}{d}$ for finding an optimal graph whose connected components have size at most $d$.
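
To make the stated trade-off concrete, the following is a minimal Python sketch that only tabulates the arithmetic of the abstract's bounds; it is not the paper's algorithm. The function name tradeoff, the vertex count n = 30, and the sample $(\ell, r)$ pairs are hypothetical choices for illustration.

def tradeoff(n: int, l: int, r: int) -> tuple[float, float]:
    """Return (approximation ratio l/r, exponent of the 2^{(l/r) n} time bound)."""
    assert 1 <= l <= r, "the abstract requires user-defined parameters with l <= r"
    ratio = l / r            # guaranteed fraction of the optimal score
    exponent = (l / r) * n   # running time grows as 2^{(l/r) n}
    return ratio, exponent

if __name__ == "__main__":
    n = 30  # hypothetical number of vertices (variables)
    for l, r in [(1, 4), (1, 2), (3, 4), (1, 1)]:
        ratio, exponent = tradeoff(n, l, r)
        print(f"l/r = {l}/{r}: approximation ratio {ratio:.2f}, time ~ 2^{exponent:.0f}")

Note that choosing $\ell = r$ gives ratio $1$ at a $2^{n}$-type cost, while smaller $\frac{\ell}{r}$ shrinks the exponent at the price of a weaker guarantee, which is the trade-off the abstract describes.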

Cite

Text

Kundu et al. "Time–Approximation Trade-Offs for Learning Bayesian Networks." Proceedings of The 12th International Conference on Probabilistic Graphical Models, 2024.

Markdown

[Kundu et al. "Time–Approximation Trade-Offs for Learning Bayesian Networks." Proceedings of The 12th International Conference on Probabilistic Graphical Models, 2024.](https://mlanthology.org/pgm/2024/kundu2024pgm-timeapproximation/)

BibTeX

@inproceedings{kundu2024pgm-timeapproximation,
  title     = {{Time–Approximation Trade-Offs for Learning Bayesian Networks}},
  author    = {Kundu, Madhumita and Parviainen, Pekka and Saurabh, Saket},
  booktitle = {Proceedings of The 12th International Conference on Probabilistic Graphical Models},
  year      = {2024},
  pages     = {486--497},
  volume    = {246},
  url       = {https://mlanthology.org/pgm/2024/kundu2024pgm-timeapproximation/}
}