DiffGraphTrans: A Differential Attention-Based Approach for Extracting Meaningful Features of Drug Combinations

Abstract

Predicting synergistic drug combinations is critical for treating complex diseases, yet existing graph-based methods struggle to balance noise suppression and interpretability in molecular representations. Specifically, the heterogeneity of molecular graphs causes Transformer-based models to amplify high-frequency noise while masking low-frequency signals linked to functional groups. To address this, we propose the Differential Graph Transformer (DiffGraphTrans), which integrates a learnable differential filter into multi-head attention. Our model dynamically suppresses irrelevant atomic interactions and amplifies key functional groups. Experiments on lung cancer drug combinations show that DiffGraphTrans outperforms baseline models and significantly improves biochemical interpretability through attention weight analysis. Our framework provides a principled approach to learning noise-robust, biologically meaningful embeddings, advancing interpretable AI for drug discovery.
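The abstract does not give the exact formulation of the differential filter. A minimal sketch of one common differential-attention construction (the difference of two softmax attention maps weighted by a learnable scalar `lam`, so that attention mass shared by both maps, i.e. background noise, cancels) might look like the following; the function and weight names here are illustrative, not taken from the paper:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def diff_attention(x, Wq1, Wk1, Wq2, Wk2, Wv, lam=0.5):
    """Differential attention sketch (hypothetical formulation).

    Two independent query/key projections produce two softmax attention
    maps; subtracting lam times the second cancels attention mass that
    both maps assign to the same (likely noisy) atom pairs, sharpening
    the remaining structure. Returns (output, attention_map).
    """
    d = Wq1.shape[1]  # per-head dimension for scaling
    a1 = softmax((x @ Wq1) @ (x @ Wk1).T / np.sqrt(d))
    a2 = softmax((x @ Wq2) @ (x @ Wk2).T / np.sqrt(d))
    attn = a1 - lam * a2  # differential map: common-mode noise cancels
    return attn @ (x @ Wv), attn
```

Note that each softmax row sums to 1, so the rows of the differential map sum to `1 - lam`; in practice a normalization or output scaling typically compensates for this.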

Cite

Text

Wu and Wang. "DiffGraphTrans: A Differential Attention-Based Approach for Extracting Meaningful Features of Drug Combinations." ICLR 2025 Workshops: LMRL, 2025.

Markdown

[Wu and Wang. "DiffGraphTrans: A Differential Attention-Based Approach for Extracting Meaningful Features of Drug Combinations." ICLR 2025 Workshops: LMRL, 2025.](https://mlanthology.org/iclrw/2025/wu2025iclrw-diffgraphtrans/)

BibTeX

@inproceedings{wu2025iclrw-diffgraphtrans,
  title     = {{DiffGraphTrans: A Differential Attention-Based Approach for Extracting Meaningful Features of Drug Combinations}},
  author    = {Wu, Bingzheng and Wang, Qi},
  booktitle = {ICLR 2025 Workshops: LMRL},
  year      = {2025},
  url       = {https://mlanthology.org/iclrw/2025/wu2025iclrw-diffgraphtrans/}
}