Differentiable Constraint-Based Causal Discovery

Abstract

Causal discovery from observational data is a fundamental task in artificial intelligence, with far-reaching implications for decision-making, prediction, and intervention. Despite significant advances, existing methods, which can be broadly categorized as constraint-based or score-based approaches, still face key limitations: constraint-based methods offer rigorous causal discovery but are often hindered by small sample sizes, while score-based methods provide flexible optimization but typically forgo explicit conditional independence testing. This work explores a third avenue: developing differentiable $d$-separation scores, obtained through percolation theory using soft logic. This enables a new type of causal discovery method: gradient-based optimization of conditional independence constraints. Empirical evaluations demonstrate the robust performance of our approach in low-sample regimes, surpassing traditional constraint-based and score-based baselines on a real-world dataset. Code implementing the proposed method is publicly available at [https://github.com/PurdueMINDS/DAGPA](https://github.com/PurdueMINDS/DAGPA).
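
To make the idea of gradient-based optimization of independence constraints concrete, the sketch below shows a toy, hypothetical version of a soft-logic reachability score: edge strengths of a small graph are learned so that a noisy-OR "percolation" score between two nodes is driven toward zero. The function `soft_reachability`, the variable names, and the toy chain graph are illustrative assumptions only; they do not reproduce the paper's actual $d$-separation score or the DAGPA implementation.

```python
# Illustrative sketch (not the paper's method): a differentiable soft-logic
# reachability score between two nodes, minimized by gradient descent to
# "enforce" a single independence-style constraint.
import torch

def soft_reachability(adj_probs: torch.Tensor, src: int, dst: int) -> torch.Tensor:
    """Soft score in [0, 1] that `src` reaches `dst`: connection probabilities
    are propagated with a product t-norm (soft AND along an edge) and a
    noisy-OR (soft OR over incoming edges)."""
    n = adj_probs.shape[0]
    reach = torch.zeros(n)
    reach[src] = 1.0
    for _ in range(n):  # n propagation steps cover all simple paths
        # probability that each node is reached via at least one incoming edge
        incoming = 1.0 - torch.prod(1.0 - reach.unsqueeze(1) * adj_probs, dim=0)
        reach = torch.maximum(reach, incoming)
    return reach[dst]

# Toy graph on 3 nodes with learnable strengths for the edges 0 -> 1 and 1 -> 2.
edge_logits = torch.nn.Parameter(torch.zeros(3, 3))
mask = torch.tensor([[0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0],
                     [0.0, 0.0, 0.0]])
optimizer = torch.optim.Adam([edge_logits], lr=0.1)

# Gradient-based enforcement of one "independence" constraint:
# drive the soft reachability from node 0 to node 2 toward zero.
for step in range(200):
    adjacency = torch.sigmoid(edge_logits) * mask
    loss = soft_reachability(adjacency, 0, 2)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

adjacency = torch.sigmoid(edge_logits) * mask
print("soft reachability 0 -> 2 after optimization:",
      float(soft_reachability(adjacency, 0, 2)))
```

Because the soft score is differentiable in the edge strengths, a conditional-independence-style constraint can be treated as a loss term and combined with other objectives in standard gradient-based optimizers, which is the general flavor of approach the abstract describes.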

Cite

Text

Zhou et al. "Differentiable Constraint-Based Causal Discovery." Advances in Neural Information Processing Systems, 2025.

Markdown

[Zhou et al. "Differentiable Constraint-Based Causal Discovery." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/zhou2025neurips-differentiable/)

BibTeX

@inproceedings{zhou2025neurips-differentiable,
  title     = {{Differentiable Constraint-Based Causal Discovery}},
  author    = {Zhou, Jincheng and Wang, Mengbo and He, Anqi and Zhou, Yumeng and Olya, Hessam and Kocaoglu, Murat and Ribeiro, Bruno},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/zhou2025neurips-differentiable/}
}