Attention for Causal Relationship Discovery from Biological Neural Dynamics

Abstract

This paper explores the potential of transformer models for learning Granger causality in networks with complex nonlinear dynamics at every node, as in neurobiological and biophysical networks. Our study primarily focuses on a proof-of-concept investigation based on simulated neural dynamics, for which the ground-truth causality is known through the underlying connectivity matrix. For transformer models trained to forecast neuronal population dynamics, we show that the cross attention module effectively captures the causal relationships among neurons, with an accuracy equal or superior to that of the most popular Granger causality analysis method. While we acknowledge that real-world neurobiological data will bring further challenges, including dynamic connectivity and unobserved variability, this research offers an encouraging preliminary glimpse into the utility of transformer models for causal representation learning in neuroscience.
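As a rough illustration of the kind of analysis the abstract describes, the sketch below (not the authors' code; all names and shapes are hypothetical assumptions) aggregates cross-attention weights collected from a trained forecasting model into a directed neuron-to-neuron score matrix and evaluates it against a known ground-truth connectivity matrix with AUROC.

```python
# Hypothetical sketch: turning cross-attention maps into an estimated
# connectivity matrix and scoring it against ground-truth connectivity.
import numpy as np
from sklearn.metrics import roc_auc_score


def attention_to_connectivity(attn_weights: np.ndarray) -> np.ndarray:
    """Aggregate cross-attention maps into a neuron-by-neuron score matrix.

    attn_weights: array of shape (num_samples, num_heads, n_neurons, n_neurons),
    where entry [..., i, j] is the attention paid to source neuron j when
    forecasting target neuron i.
    """
    # Average over samples and heads to obtain one directed score per (i, j) pair.
    return attn_weights.mean(axis=(0, 1))


def score_against_ground_truth(scores: np.ndarray, gt_adjacency: np.ndarray) -> float:
    """AUROC of the attention-derived scores against the binary ground-truth
    connectivity matrix, ignoring self-connections."""
    n = gt_adjacency.shape[0]
    off_diag = ~np.eye(n, dtype=bool)
    return roc_auc_score(gt_adjacency[off_diag].ravel(), scores[off_diag].ravel())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_neurons = 10
    # Toy ground-truth adjacency (stand-in for the simulation's connectivity matrix).
    gt = (rng.random((n_neurons, n_neurons)) < 0.2).astype(int)
    # Stand-in for attention weights collected from a trained forecaster.
    attn = rng.random((32, 4, n_neurons, n_neurons)) + 0.5 * gt[None, None]
    conn_scores = attention_to_connectivity(attn)
    print(f"AUROC vs. ground truth: {score_against_ground_truth(conn_scores, gt):.3f}")
```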

Cite

Text

Lu et al. "Attention for Causal Relationship Discovery from Biological Neural Dynamics." NeurIPS 2023 Workshops: CRL, 2023.

Markdown

[Lu et al. "Attention for Causal Relationship Discovery from Biological Neural Dynamics." NeurIPS 2023 Workshops: CRL, 2023.](https://mlanthology.org/neuripsw/2023/lu2023neuripsw-attention/)

BibTeX

@inproceedings{lu2023neuripsw-attention,
  title     = {{Attention for Causal Relationship Discovery from Biological Neural Dynamics}},
  author    = {Lu, Ziyu and Tabassum, Anika and Kulkarni, Shruti R. and Mi, Lu and Kutz, J. Nathan and Shea-Brown, Eric Todd and Lim, Seung-Hwan},
  booktitle = {NeurIPS 2023 Workshops: CRL},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/lu2023neuripsw-attention/}
}