Efficient Approximation of Neural Population Structure and Correlations with Probabilistic Circuits

Abstract

We present a computationally efficient framework for modeling a wide range of population structures with higher-order correlations and large numbers of neurons. Our method is based on a special type of Bayesian network that has linear inference time and is founded on the concept of contextual independence. Moreover, we use an efficient architecture learning method for network selection, enabling us to model large neural populations even with small amounts of data. Our framework is both fast and accurate in approximating neural population structures, and it allows us to reliably quantify higher-order neural correlations. We test our method on simulated neural populations commonly used to generate higher-order correlations, as well as on publicly available large-scale neural recordings from the Allen Brain Observatory. Our approach significantly outperforms other models both in terms of statistical measures and alignment with experimental evidence.
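The probabilistic circuits referenced in the abstract (e.g., sum-product networks) evaluate a joint probability in a single linear-time bottom-up pass: product nodes combine variables that are independent within a context, and sum nodes mix over contexts. The sketch below is purely illustrative of that idea over two binary "neurons"; the mixture weights and Bernoulli parameters are made up and do not reflect the paper's learned models.

```python
def spn_joint(x1, x2):
    """Evaluate a toy sum-product network P(x1, x2) in one linear pass.

    Illustrative only: parameters are invented, not taken from the paper.
    """
    # Leaves: Bernoulli probabilities for each variable within each
    # mixture component (context) A and B.
    l1a = 0.9 if x1 else 0.1   # P(x1 | context A)
    l2a = 0.8 if x2 else 0.2   # P(x2 | context A)
    l1b = 0.2 if x1 else 0.8   # P(x1 | context B)
    l2b = 0.3 if x2 else 0.7   # P(x2 | context B)

    # Product nodes: variables are contextually independent within
    # each component, so their leaf probabilities multiply.
    pa = l1a * l2a
    pb = l1b * l2b

    # Sum node: weighted mixture over the two contexts (weights sum to 1).
    return 0.6 * pa + 0.4 * pb

# The circuit defines a valid distribution: probabilities sum to 1.
total = sum(spn_joint(x1, x2) for x1 in (0, 1) for x2 in (0, 1))
```

Each observed state is scored with one multiplication/addition per circuit edge, which is the linear-time inference property the abstract highlights.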

Cite

Text

Khalvati et al. "Efficient Approximation of Neural Population Structure and Correlations with Probabilistic Circuits." International Conference on Learning Representations, 2023.

Markdown

[Khalvati et al. "Efficient Approximation of Neural Population Structure and Correlations with Probabilistic Circuits." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/khalvati2023iclr-efficient/)

BibTeX

@inproceedings{khalvati2023iclr-efficient,
  title     = {{Efficient Approximation of Neural Population Structure and Correlations with Probabilistic Circuits}},
  author    = {Khalvati, Koosha and Johnson, Samantha and Mihalas, Stefan and Buice, Michael A.},
  booktitle = {International Conference on Learning Representations},
  year      = {2023},
  url       = {https://mlanthology.org/iclr/2023/khalvati2023iclr-efficient/}
}