A Recipe for Causal Graph Regression: Confounding Effects Revisited
Abstract
By recognizing causal subgraphs, causal graph learning (CGL) has emerged as a promising approach for improving the generalizability of graph neural networks under out-of-distribution (OOD) scenarios. However, the empirical successes of CGL techniques are mostly demonstrated in classification settings, while regression tasks, a more challenging setting in graph learning, are overlooked. We thus devote this work to tackling causal graph regression (CGR); to this end, we rethink the handling of confounding effects in existing CGL studies, which mainly deal with classification. Specifically, we reflect on the predictive power of confounders in graph-level regression and generalize classification-specific causal intervention techniques to regression through a lens of contrastive learning. Extensive experiments on graph OOD benchmarks validate the efficacy of our proposals for CGR. The model implementation and code are available at https://github.com/causal-graph/CGR.
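To make the abstract's idea of "causal intervention through a lens of contrastive learning" concrete, below is a minimal, illustrative sketch (not the authors' implementation; see the linked repository for that). It assumes a disentangling GNN encoder has already produced causal-subgraph and confounder embeddings per graph; the names `intervention_loss`, `h_causal`, and `h_conf` are hypothetical. The sketch combines an invariance term (predictions should not change when confounders are swapped across graphs) with a regression-flavored contrastive term (causal embeddings of graphs with similar targets are pulled together).

```python
# Illustrative sketch only: a contrastive-style intervention loss for
# graph-level regression. All names here are hypothetical placeholders.
import torch
import torch.nn.functional as F


def intervention_loss(h_causal, h_conf, predictor, y, sigma=1.0, temp=0.1):
    """h_causal: (B, d) causal-subgraph embeddings
    h_conf:   (B, d) confounder (non-causal) embeddings
    predictor: maps a (B, 2d) concatenation to a (B, 1) prediction
    y:        (B,) regression targets
    """
    B = h_causal.size(0)

    # "Intervention": pair each causal embedding with a random graph's confounder.
    perm = torch.randperm(B)
    pred_orig = predictor(torch.cat([h_causal, h_conf], dim=-1)).squeeze(-1)
    pred_intv = predictor(torch.cat([h_causal, h_conf[perm]], dim=-1)).squeeze(-1)

    # Invariance term: swapping confounders should leave the prediction unchanged.
    inv = F.mse_loss(pred_intv, pred_orig.detach())

    # Contrastive term for regression: weight pairs by target similarity,
    # so graphs with close targets attract in the causal embedding space.
    z = F.normalize(h_causal, dim=-1)
    sim = z @ z.t() / temp
    w = torch.exp(-((y[:, None] - y[None, :]) ** 2) / (2 * sigma ** 2))
    w = w - torch.diag_embed(torch.diagonal(w))  # exclude self-pairs
    logp = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    ctr = -(w * logp).sum(1) / w.sum(1).clamp_min(1e-8)

    return inv + ctr.mean()


# Usage with random tensors standing in for encoder outputs:
B, d = 8, 16
predictor = torch.nn.Linear(2 * d, 1)
loss = intervention_loss(torch.randn(B, d), torch.randn(B, d),
                         predictor, torch.randn(B))
loss.backward()
```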
Cite
Text
Yin et al. "A Recipe for Causal Graph Regression: Confounding Effects Revisited." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Yin et al. "A Recipe for Causal Graph Regression: Confounding Effects Revisited." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/yin2025icml-recipe/)

BibTeX
@inproceedings{yin2025icml-recipe,
title = {{A Recipe for Causal Graph Regression: Confounding Effects Revisited}},
author = {Yin, Yujia and Qu, Tianyi and Wang, Zihao and Chen, Yifan},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {72414--72427},
volume = {267},
url = {https://mlanthology.org/icml/2025/yin2025icml-recipe/}
}