Equivariant Transformers for Neural Network Based Molecular Potentials

Abstract

The prediction of quantum mechanical properties has historically been plagued by a trade-off between accuracy and speed. Machine learning potentials have previously shown great success in this domain, reaching increasingly better accuracy while maintaining computational efficiency comparable to classical force fields. In this work we propose TorchMD-NET, a novel equivariant Transformer (ET) architecture, which outperforms the state of the art on MD17, ANI-1, and many QM9 targets in both accuracy and computational efficiency. Through an extensive attention weight analysis, we gain valuable insights into the black box predictor and show differences in the learned representation of conformers versus conformations sampled from molecular dynamics or normal modes. Furthermore, we highlight the importance of datasets that include off-equilibrium conformations for the evaluation of molecular potentials.

Cite

Text

Thölke and De Fabritiis. "Equivariant Transformers for Neural Network Based Molecular Potentials." International Conference on Learning Representations, 2022.

Markdown

[Thölke and De Fabritiis. "Equivariant Transformers for Neural Network Based Molecular Potentials." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/tholke2022iclr-equivariant/)

BibTeX

@inproceedings{tholke2022iclr-equivariant,
  title     = {{Equivariant Transformers for Neural Network Based Molecular Potentials}},
  author    = {Thölke, Philipp and De Fabritiis, Gianni},
  booktitle = {International Conference on Learning Representations},
  year      = {2022},
  url       = {https://mlanthology.org/iclr/2022/tholke2022iclr-equivariant/}
}