Molecular Energy Learning Using Alternative Blackbox Matrix-Matrix Multiplication Algorithm for Exact Gaussian Process

Abstract

We present an application of the blackbox matrix-matrix multiplication (BBMM) algorithm to scale up Gaussian Process (GP) training of molecular energies in the molecular-orbital based machine learning (MOB-ML) framework. We also propose an alternative implementation of BBMM (AltBBMM) that trains more efficiently (over four-fold speedup) with the same accuracy and transferability as the original BBMM implementation. Previously, MOB-ML training was limited to 220 molecules; BBMM and AltBBMM scale it up by more than 30-fold to 6500 molecules (over a million pair energies). The accuracy and transferability of both algorithms are examined on benchmark datasets of organic molecules with 7 and 13 heavy atoms. These lower-scaling implementations of exact GP preserve the state-of-the-art learning efficiency of MOB-ML in the low-data regime while extending it to the large-data regime with better accuracy than other available machine learning methods for molecular energies.
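The core idea behind BBMM-style exact GP training is that the linear solve (K + σ²I)⁻¹y never requires the kernel matrix explicitly: it only needs matrix-vector products, which can be computed in blocks and fed to an iterative solver such as conjugate gradients. The sketch below illustrates this pattern with a blocked RBF matvec and a plain CG loop; the kernel choice, block size, and tolerances are illustrative assumptions, not the paper's actual MOB-ML configuration.

```python
import numpy as np

def kernel_matvec(X, v, lengthscale=1.0, noise=1e-2, block=64):
    """Compute (K + noise*I) @ v for an RBF kernel, one row block
    at a time, so the full n x n kernel matrix is never stored."""
    n = X.shape[0]
    out = np.empty(n)
    for start in range(0, n, block):
        Xb = X[start:start + block]
        # Pairwise squared distances between this block and all points.
        sq = ((Xb[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        Kb = np.exp(-0.5 * sq / lengthscale**2)
        out[start:start + block] = Kb @ v
    return out + noise * v

def conjugate_gradients(matvec, y, tol=1e-6, max_iter=500):
    """Solve A x = y given only the blackbox matvec v -> A @ v."""
    x = np.zeros_like(y)
    r = y - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Toy regression problem (synthetic data, illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ rng.normal(size=5) + 0.01 * rng.normal(size=200)

alpha = conjugate_gradients(lambda v: kernel_matvec(X, v), y)
resid = np.linalg.norm(kernel_matvec(X, alpha) - y)
```

Because the matvec is the only interface to the kernel, the same loop parallelizes across GPUs and scales to the million-pair-energy regime described above; the production BBMM implementation adds preconditioning and batched probe vectors on top of this basic structure.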

Cite

Text

Sun et al. "Molecular Energy Learning Using Alternative Blackbox Matrix-Matrix Multiplication Algorithm for Exact Gaussian Process." NeurIPS 2021 Workshops: AI4Science, 2021.

Markdown

[Sun et al. "Molecular Energy Learning Using Alternative Blackbox Matrix-Matrix Multiplication Algorithm for Exact Gaussian Process." NeurIPS 2021 Workshops: AI4Science, 2021.](https://mlanthology.org/neuripsw/2021/sun2021neuripsw-molecular/)

BibTeX

@inproceedings{sun2021neuripsw-molecular,
  title     = {{Molecular Energy Learning Using Alternative Blackbox Matrix-Matrix Multiplication Algorithm for Exact Gaussian Process}},
  author    = {Sun, Jiace and Cheng, Lixue and Miller, Thomas},
  booktitle = {NeurIPS 2021 Workshops: AI4Science},
  year      = {2021},
  url       = {https://mlanthology.org/neuripsw/2021/sun2021neuripsw-molecular/}
}