Simple GNN Regularisation for 3D Molecular Property Prediction and Beyond
Abstract
In this paper we show that simple noisy regularisation can be an effective way to address oversmoothing. We first argue that regularisers addressing oversmoothing should both penalise node latent similarity and encourage meaningful node representations. From this observation we derive "Noisy Nodes", a simple technique in which we corrupt the input graph with noise and add a noise-correcting node-level loss. The diverse node-level loss encourages latent node diversity, and the denoising objective encourages graph manifold learning. Our regulariser applies well-studied methods in simple, straightforward ways which allow even generic architectures to overcome oversmoothing and achieve state-of-the-art results on quantum chemistry tasks such as QM9 and Open Catalyst, and to improve results significantly on Open Graph Benchmark (OGB) datasets. Our results suggest Noisy Nodes can serve as a complementary building block in the GNN toolkit.
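A minimal sketch of the training objective described in the abstract, assuming a hypothetical GNN `model` that returns a graph-level prediction together with a per-node prediction of the added noise; the noise scale `sigma` and weight `lambda_denoise` are illustrative placeholders, not values from the paper.

```python
import torch
import torch.nn.functional as F

def noisy_nodes_loss(model, positions, node_feats, edge_index, graph_target,
                     sigma=0.02, lambda_denoise=0.1):
    """One training-step loss with a Noisy-Nodes-style regulariser (sketch).

    Assumes `model(noisy_positions, node_feats, edge_index)` returns
    (graph_pred, node_noise_pred): a graph-level prediction and a per-node
    estimate of the noise that was added to the positions.
    """
    # Corrupt the input graph: perturb the 3D node positions with Gaussian noise.
    noise = sigma * torch.randn_like(positions)
    noisy_positions = positions + noise

    graph_pred, node_noise_pred = model(noisy_positions, node_feats, edge_index)

    # Primary graph-level objective (e.g. molecular property regression).
    primary_loss = F.l1_loss(graph_pred, graph_target)

    # Noise-correcting node-level loss: each node must recover its own
    # perturbation, which discourages identical node latents.
    denoise_loss = F.mse_loss(node_noise_pred, noise)

    return primary_loss + lambda_denoise * denoise_loss
```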
Cite
Text
Godwin et al. "Simple GNN Regularisation for 3D Molecular Property Prediction and Beyond." International Conference on Learning Representations, 2022.
Markdown
[Godwin et al. "Simple GNN Regularisation for 3D Molecular Property Prediction and Beyond." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/godwin2022iclr-simple/)
BibTeX
@inproceedings{godwin2022iclr-simple,
title = {{Simple GNN Regularisation for 3D Molecular Property Prediction and Beyond}},
author = {Godwin, Jonathan and Schaarschmidt, Michael and Gaunt, Alexander L and Sanchez-Gonzalez, Alvaro and Rubanova, Yulia and Veličković, Petar and Kirkpatrick, James and Battaglia, Peter},
booktitle = {International Conference on Learning Representations},
year = {2022},
url = {https://mlanthology.org/iclr/2022/godwin2022iclr-simple/}
}