Probabilistic Neural Circuits

Abstract

Probabilistic circuits (PCs) have gained prominence in recent years as a versatile framework for probabilistic models that support tractable queries while remaining expressive enough to model complex probability distributions. Nevertheless, tractability comes at a cost: PCs are less expressive than neural networks. In this paper we introduce probabilistic neural circuits (PNCs), which strike a balance between PCs and neural networks in terms of tractability and expressive power. Theoretically, we show that PNCs can be interpreted as deep mixtures of Bayesian networks. Experimentally, we demonstrate that PNCs constitute powerful function approximators.
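To make the tractability claim concrete, the following is a minimal illustrative sketch of a probabilistic circuit over two binary variables (not code from the paper, and much simpler than the PNCs it introduces): leaves are Bernoulli distributions, product nodes factorize over disjoint variable scopes, and sum nodes form convex mixtures. Marginalization is a single feed-forward pass, which is the kind of tractable query the abstract refers to.

```python
# Minimal illustrative probabilistic circuit (hypothetical example, not the
# paper's PNC model). Nodes are plain Python closures.

def bernoulli(p):
    # Leaf: returns P(X = x); passing x = None marginalizes X out.
    return lambda x: 1.0 if x is None else (p if x == 1 else 1.0 - p)

def product(left, right):
    # Product node over disjoint scopes: a factorized distribution.
    return lambda x1, x2: left(x1) * right(x2)

def mixture(w, comp_a, comp_b):
    # Sum node: convex combination of two child distributions.
    return lambda x1, x2: w * comp_a(x1, x2) + (1.0 - w) * comp_b(x1, x2)

# A mixture of two fully factorized distributions over (X1, X2).
pc = mixture(0.3,
             product(bernoulli(0.9), bernoulli(0.2)),
             product(bernoulli(0.1), bernoulli(0.7)))

# Tractable queries in one bottom-up evaluation:
joint = pc(1, 0)        # P(X1=1, X2=0)
marginal = pc(1, None)  # P(X1=1), marginalizing X2 at the leaf
total = pc(None, None)  # normalization check: should be 1.0
```

Because sum weights are convex and product scopes are disjoint, any marginal or conditional follows from such single passes; the paper's PNCs relax this structure to gain expressive power, trading away some of this tractability.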

Cite

Text

Dos Martires. "Probabilistic Neural Circuits." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I15.29675

Markdown

[Dos Martires. "Probabilistic Neural Circuits." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/martires2024aaai-probabilistic/) doi:10.1609/AAAI.V38I15.29675

BibTeX

@inproceedings{martires2024aaai-probabilistic,
  title     = {{Probabilistic Neural Circuits}},
  author    = {Dos Martires, Pedro Zuidberg},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {17280--17289},
  doi       = {10.1609/AAAI.V38I15.29675},
  url       = {https://mlanthology.org/aaai/2024/martires2024aaai-probabilistic/}
}