On Neural Networks as Infinite Tree-Structured Probabilistic Graphical Models

Abstract

Deep neural networks (DNNs) lack the precise semantics and definitive probabilistic interpretation of probabilistic graphical models (PGMs). In this paper, we propose a solution: constructing infinite tree-structured PGMs that correspond exactly to neural networks. Our research reveals that DNNs, during forward propagation, perform an approximation of PGM inference that becomes exact in this alternative PGM structure. Not only does our research complement existing studies that describe neural networks as kernel machines or infinite-sized Gaussian processes, it also elucidates a more direct approximation that DNNs make to exact inference in PGMs. Potential benefits include improved pedagogy and interpretation of DNNs, and algorithms that can merge the strengths of PGMs and DNNs.
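The abstract's central claim, that a DNN forward pass approximates PGM inference, can be illustrated with a toy sigmoid belief network. The sketch below is purely illustrative (the network, weights, and variable names are hypothetical, not from the paper): the forward pass plugs in the expectations of the hidden units, while marginalizing over the hidden units, here estimated by Monte Carlo sampling, gives the PGM's exact conditional probability. The two values are close but generally not identical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical two-layer sigmoid belief network (illustrative only):
#   p(h_j = 1 | x) = sigmoid(W1 @ x)_j,   p(y = 1 | h) = sigmoid(w2 @ h)
W1 = rng.normal(size=(5, 4))
w2 = rng.normal(size=5)
x = np.array([1.0, 0.0, 1.0, 1.0])

p_h = sigmoid(W1 @ x)          # marginals of the hidden units given x

# DNN forward pass: propagate expectations of h as if they were values.
forward = sigmoid(w2 @ p_h)

# PGM inference: marginalize over h, estimated by Monte Carlo sampling.
n = 100_000
h = (rng.random((n, 5)) < p_h).astype(float)   # samples of h given x
exact_mc = sigmoid(h @ w2).mean()              # estimate of p(y = 1 | x)

print(forward, exact_mc)  # close, but generally not identical
```

The gap between the two numbers is the approximation the paper analyzes; its contribution is an alternative (infinite tree-structured) PGM in which the forward-pass computation is exact inference rather than an approximation.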

Cite

Text

Li et al. "On Neural Networks as Infinite Tree-Structured Probabilistic Graphical Models." Neural Information Processing Systems, 2024. doi:10.52202/079017-0150

Markdown

[Li et al. "On Neural Networks as Infinite Tree-Structured Probabilistic Graphical Models." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/li2024neurips-neural/) doi:10.52202/079017-0150

BibTeX

@inproceedings{li2024neurips-neural,
  title     = {{On Neural Networks as Infinite Tree-Structured Probabilistic Graphical Models}},
  author    = {Li, Boyao and Thomson, Alexander J. and Nassif, Houssam and Engelhard, Matthew M. and Page, David},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-0150},
  url       = {https://mlanthology.org/neurips/2024/li2024neurips-neural/}
}