Supervised Pretraining for Molecular Force Fields and Properties Prediction

Abstract

Machine learning approaches have become popular for molecular modeling tasks, including molecular force fields and property prediction. Traditional supervised learning methods suffer from the scarcity of labeled data for particular tasks, motivating the use of large-scale datasets from other relevant tasks. We propose to pretrain neural networks on a dataset of 86 million molecules, with atom charges and 3D geometries as inputs and molecular energies as labels. Experiments show that, compared to training from scratch, fine-tuning the pretrained model significantly improves performance on seven molecular property prediction tasks and two force field tasks. We also demonstrate that the representations learned by the pretrained model encode substantial information about molecular structure: linear probing of the representations can predict atom types, interatomic distances, classes of molecular scaffolds, and the presence of molecular fragments. Our results show that supervised pretraining is a promising research direction in molecular modeling.
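The linear-probing analysis mentioned above can be illustrated with a minimal sketch: a linear model is fit on top of frozen representations, so any predictive power comes from the pretrained features alone. The embeddings and target below are random stand-ins, not the paper's actual model or data.

```python
import numpy as np

# Hypothetical stand-in for frozen embeddings produced by a pretrained model.
# In the paper's setting these would come from the energy-pretrained network;
# here they are random vectors used purely to demonstrate the probing recipe.
rng = np.random.default_rng(0)
n_molecules, dim = 200, 32
embeddings = rng.normal(size=(n_molecules, dim))   # frozen representations

# Assumed target: some scalar structural quantity (e.g. an interatomic
# distance) that, for this toy example, is a linear function of the features.
true_w = rng.normal(size=dim)
targets = embeddings @ true_w

# Linear probe: fit only a linear map on top of the frozen features.
w, *_ = np.linalg.lstsq(embeddings, targets, rcond=None)
preds = embeddings @ w

# R^2 of the probe measures how much of the target the representations encode.
r2 = 1.0 - np.sum((targets - preds) ** 2) / np.sum((targets - targets.mean()) ** 2)
```

A high R² (or classification accuracy, for discrete properties such as atom types or fragment presence) indicates the information is linearly decodable from the representations, which is the sense in which the abstract's probing claim is made.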

Cite

Text

Gao et al. "Supervised Pretraining for Molecular Force Fields and Properties Prediction." NeurIPS 2022 Workshops: AI4Science, 2022.

Markdown

[Gao et al. "Supervised Pretraining for Molecular Force Fields and Properties Prediction." NeurIPS 2022 Workshops: AI4Science, 2022.](https://mlanthology.org/neuripsw/2022/gao2022neuripsw-supervised/)

BibTeX

@inproceedings{gao2022neuripsw-supervised,
  title     = {{Supervised Pretraining for Molecular Force Fields and Properties Prediction}},
  author    = {Gao, Xiang and Gao, Weihao and Xiao, Wenzhi and Wang, Zhirui and Wang, Chong and Xiang, Liang},
  booktitle = {NeurIPS 2022 Workshops: AI4Science},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/gao2022neuripsw-supervised/}
}