On the Expressive Power of Tree-Structured Probabilistic Circuits

Abstract

Probabilistic circuits (PCs) have emerged as a powerful framework for compactly representing probability distributions and performing efficient, exact probabilistic inference. It has been shown that a PC with a general directed acyclic graph (DAG) structure can be understood as a mixture of exponentially (in its height) many components, each of which is a product distribution over univariate marginals. However, existing structure learning algorithms for PCs often generate tree-structured circuits, or use tree-structured circuits as intermediate steps before compressing them into DAG-structured circuits. This raises the intriguing question of whether there exists an exponential gap between DAG- and tree-structured PCs. In this paper, we answer this question in the negative by proving that, for $n$ variables, there is a sub-exponential upper bound of $n^{O(\log n)}$ on the size of an equivalent tree-structured PC computing the same probability distribution. On the other hand, we also show that, once a depth restriction is imposed on the tree, there is a super-polynomial separation between tree- and DAG-structured PCs. Our work takes an important step towards understanding the expressive power of tree-structured PCs, and our techniques may be of independent interest for the study of structure learning algorithms for PCs.
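
For intuition, the mixture-of-products view referenced in the abstract can be sketched as follows; the number of components $m$, the mixture weights $w_i$, and the univariate marginals $p_{i,j}$ are illustrative notation, not the paper's own symbols:

$$p(x_1, \dots, x_n) = \sum_{i=1}^{m} w_i \prod_{j=1}^{n} p_{i,j}(x_j), \qquad w_i \ge 0, \quad \sum_{i=1}^{m} w_i = 1,$$

where $m$ may grow exponentially with the height of a DAG-structured PC, and the abstract's upper bound states that a tree-structured PC of size $n^{O(\log n)}$ suffices to compute the same distribution.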

Cite

Text

Yin and Zhao. "On the Expressive Power of Tree-Structured Probabilistic Circuits." ICML 2024 Workshops: SPIGM, 2024.

BibTeX

@inproceedings{yin2024icmlw-expressive,
  title     = {{On the Expressive Power of Tree-Structured Probabilistic Circuits}},
  author    = {Yin, Lang and Zhao, Han},
  booktitle = {ICML 2024 Workshops: SPIGM},
  year      = {2024},
  url       = {https://mlanthology.org/icmlw/2024/yin2024icmlw-expressive/}
}