Learnability of Parameter-Bounded Bayes Nets
Abstract
Bayes nets are extensively used in practice to efficiently represent joint probability distributions over a set of random variables and to capture dependency relations. In a seminal paper, Chickering et al. (JMLR 2004) showed that given a distribution P that is defined as the marginal distribution of a Bayes net, it is NP-hard to decide whether there is a parameter-bounded Bayes net that represents P. They called this problem LEARN. In this work, we extend the NP-hardness result of LEARN and prove the NP-hardness of a promise search variant of LEARN, whereby the Bayes net in question is guaranteed to exist and one is asked to find such a Bayes net. We complement our hardness result with a positive result about the sample complexity that is sufficient to recover a parameter-bounded Bayes net that is close (in TV distance) to a given distribution P, represented by some parameter-bounded Bayes net, thereby generalizing a degree-bounded sample complexity result of Brustle et al. (EC 2020).
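As a minimal illustration of the factorization and parameter counting that the abstract refers to (this toy network and its CPT values are hypothetical, not from the paper): a Bayes net writes the joint distribution as a product of conditional probability tables (CPTs), one per node given its parents, and "parameter-bounded" learning bounds the total number of CPT entries.

```python
from itertools import product

# Hypothetical Bayes net over binary variables with edges A -> B and A -> C.
cpt_a = {0: 0.6, 1: 0.4}                    # P(A)
cpt_b = {(0, 0): 0.7, (0, 1): 0.3,          # P(B | A), keyed by (a, b)
         (1, 0): 0.2, (1, 1): 0.8}
cpt_c = {(0, 0): 0.9, (0, 1): 0.1,          # P(C | A), keyed by (a, c)
         (1, 0): 0.5, (1, 1): 0.5}

def joint(a, b, c):
    """P(A=a, B=b, C=c) via the Bayes-net factorization P(A) P(B|A) P(C|A)."""
    return cpt_a[a] * cpt_b[(a, b)] * cpt_c[(a, c)]

# Free parameters: 1 (for A) + 2 (for B|A) + 2 (for C|A) = 5, versus
# 2^3 - 1 = 7 for an unrestricted joint over three binary variables.
total = sum(joint(a, b, c) for a, b, c in product([0, 1], repeat=3))
print(round(total, 10))  # the factorized joint sums to 1
```

The hardness results concern deciding (and finding) whether such a factorization with a bounded parameter count exists for a given distribution P.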
Cite
Text
Bhattacharyya et al. "Learnability of Parameter-Bounded Bayes Nets." ICML 2024 Workshops: SPIGM, 2024.
Markdown
[Bhattacharyya et al. "Learnability of Parameter-Bounded Bayes Nets." ICML 2024 Workshops: SPIGM, 2024.](https://mlanthology.org/icmlw/2024/bhattacharyya2024icmlw-learnability/)
BibTeX
@inproceedings{bhattacharyya2024icmlw-learnability,
title = {{Learnability of Parameter-Bounded Bayes Nets}},
author = {Bhattacharyya, Arnab and Choo, Davin and Gayen, Sutanu and Myrisiotis, Dimitrios},
booktitle = {ICML 2024 Workshops: SPIGM},
year = {2024},
url = {https://mlanthology.org/icmlw/2024/bhattacharyya2024icmlw-learnability/}
}