Equivariant Matrix Function Neural Networks

Abstract

Graph Neural Networks (GNNs), especially message-passing neural networks (MPNNs), have emerged as powerful architectures for learning on graphs in diverse applications. However, MPNNs face challenges when modeling non-local interactions in systems such as large conjugated molecules, metals, or amorphous materials. Although spectral GNNs and traditional neural networks such as recurrent neural networks and transformers mitigate these challenges, they often lack extensivity, adaptability, generalizability, or computational efficiency, or they fail to capture detailed structural relationships and symmetries in the data. To address these concerns, we introduce Matrix Function Neural Networks (MFNs), a novel architecture that parameterizes non-local interactions through analytic equivariant matrix functions. Employing resolvent expansions offers a straightforward implementation and the potential for linear scaling with system size. The MFN architecture achieves state-of-the-art performance on standard graph benchmarks, such as the ZINC and TU datasets, and is able to capture intricate non-local interactions in quantum systems. The code and datasets will be made public.
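
To make the resolvent-expansion idea concrete, here is a minimal NumPy sketch of the underlying numerical primitive: evaluating an analytic matrix function f(A) of a symmetric matrix A via a contour (resolvent) integral, f(A) = (1/2πi) ∮ f(z)(zI − A)⁻¹ dz, discretized by trapezoidal quadrature on a circle enclosing the spectrum. This is not the authors' released implementation; the function name, contour choice, and quadrature parameters are illustrative assumptions. Note that the equivariance property f(QAQᵀ) = Q f(A) Qᵀ holds exactly for each resolvent term, so it is inherited by the quadrature approximation.

import numpy as np

def matrix_function_resolvent(A, f, n_quad=128):
    # Approximate f(A) for symmetric A via the resolvent integral
    #   f(A) = (1 / 2*pi*i) * contour_integral f(z) (zI - A)^{-1} dz,
    # using trapezoidal quadrature on a circle of radius r about the origin.
    # f must be analytic inside the contour (e.g. np.exp, a polynomial).
    n = A.shape[0]
    r = 1.5 * np.linalg.norm(A, 2) + 1.0   # circle safely enclosing spec(A)
    I = np.eye(n)
    out = np.zeros((n, n), dtype=complex)
    for k in range(n_quad):
        theta = 2.0 * np.pi * k / n_quad
        z = r * np.exp(1j * theta)
        resolvent = np.linalg.solve(z * I - A, I)          # (zI - A)^{-1}
        out += f(z) * r * np.exp(1j * theta) * resolvent   # f(z) * z'(theta) / i
    return (out / n_quad).real   # imaginary parts cancel for real symmetric A

# Sanity checks: agreement with the eigendecomposition, and exact
# orthogonal equivariance f(Q A Q^T) = Q f(A) Q^T.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)); A = 0.5 * (A + A.T)
evals, evecs = np.linalg.eigh(A)
assert np.allclose(matrix_function_resolvent(A, np.exp),
                   evecs @ np.diag(np.exp(evals)) @ evecs.T, atol=1e-8)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
assert np.allclose(matrix_function_resolvent(Q @ A @ Q.T, np.exp),
                   Q @ matrix_function_resolvent(A, np.exp) @ Q.T, atol=1e-8)

In the dense form above each quadrature point costs a linear solve; the linear scaling mentioned in the abstract would come from exploiting sparsity of the graph operator when solving these shifted systems, which this sketch does not attempt.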

Cite

Text

Batatia et al. "Equivariant Matrix Function Neural Networks." International Conference on Learning Representations, 2024.

Markdown

[Batatia et al. "Equivariant Matrix Function Neural Networks." International Conference on Learning Representations, 2024.](https://mlanthology.org/iclr/2024/batatia2024iclr-equivariant/)

BibTeX

@inproceedings{batatia2024iclr-equivariant,
  title     = {{Equivariant Matrix Function Neural Networks}},
  author    = {Batatia, Ilyes and Schaaf, Lars Leon and Cs{\'a}nyi, G{\'a}bor and Ortner, Christoph and Faber, Felix Andreas},
  booktitle = {International Conference on Learning Representations},
  year      = {2024},
  url       = {https://mlanthology.org/iclr/2024/batatia2024iclr-equivariant/}
}