KGCL: Knowledge-Enhanced Graph Contrastive Learning for Retrosynthesis Prediction Based on Molecular Graph Editing

Abstract

Retrosynthesis, which predicts the reactants of a given target molecule, is an essential task for drug discovery. Retrosynthesis prediction based on molecular graph editing has garnered widespread attention due to its excellent interpretability. However, existing methods fail to effectively incorporate chemical knowledge when learning molecular representations. To address this issue, we propose a Knowledge-enhanced Graph Contrastive Learning model (KGCL), which retrieves functional-group embeddings from a chemical knowledge graph and integrates them into the atomic embeddings of the product molecule via an attention mechanism. Furthermore, we introduce a graph contrastive learning strategy that generates augmented samples using graph edits to improve the molecular graph encoder. Our proposed method outperforms the strong baseline Graph2Edits by 1.6% and 3.2% in terms of top-1 accuracy and top-1 round-trip accuracy on the USPTO-50K dataset, respectively, and also achieves new state-of-the-art performance among semi-template-based methods on the USPTO-FULL dataset.
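The knowledge-fusion step described in the abstract can be illustrated with a minimal pure-Python sketch: the atom embedding acts as a query over the retrieved functional-group embeddings, and the attention-weighted context is fused back into the atom embedding. The single-head scaled dot-product form, residual fusion, and all names here are illustrative assumptions, not the paper's actual architecture.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def knowledge_attention(atom_emb, fg_embs):
    """Fuse functional-group embeddings into an atom embedding.

    atom_emb: list[float], one atom's embedding (the query).
    fg_embs:  list[list[float]], functional-group embeddings retrieved
              from the knowledge graph (keys and values).
    Returns the atom embedding enriched with an attention-weighted
    mixture of the group embeddings (residual fusion, an assumption).
    """
    d = len(atom_emb)
    scores = [dot(atom_emb, fg) / math.sqrt(d) for fg in fg_embs]
    weights = softmax(scores)
    context = [sum(w * fg[i] for w, fg in zip(weights, fg_embs))
               for i in range(d)]
    return [a + c for a, c in zip(atom_emb, context)]
```

In practice this would run per atom over learned embeddings inside the graph encoder; the sketch only shows the attention arithmetic.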

Cite

Text

Yang et al. "KGCL: Knowledge-Enhanced Graph Contrastive Learning for Retrosynthesis Prediction Based on Molecular Graph Editing." International Joint Conference on Artificial Intelligence, 2025. doi:10.24963/IJCAI.2025/1048

Markdown

[Yang et al. "KGCL: Knowledge-Enhanced Graph Contrastive Learning for Retrosynthesis Prediction Based on Molecular Graph Editing." International Joint Conference on Artificial Intelligence, 2025.](https://mlanthology.org/ijcai/2025/yang2025ijcai-kgcl/) doi:10.24963/IJCAI.2025/1048

BibTeX

@inproceedings{yang2025ijcai-kgcl,
  title     = {{KGCL: Knowledge-Enhanced Graph Contrastive Learning for Retrosynthesis Prediction Based on Molecular Graph Editing}},
  author    = {Yang, Fengqin and Zhao, Dekui and Qiu, Haoxuan and Li, Yifei and Fu, Zhiguo},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {9429--9437},
  doi       = {10.24963/IJCAI.2025/1048},
  url       = {https://mlanthology.org/ijcai/2025/yang2025ijcai-kgcl/}
}