Causal Discovery from Conditionally Stationary Time Series
Abstract
Causal discovery, i.e., inferring underlying causal relationships from observational data, is highly challenging for AI systems. In a time series modeling context, traditional causal discovery methods mainly consider constrained scenarios with fully observed variables and/or data from stationary time series. We develop a causal discovery approach to handle a wide class of nonstationary time series that are conditionally stationary, where the nonstationary behavior is modeled as stationarity conditioned on a set of latent state variables. Our approach, named State-Dependent Causal Inference (SDCI), recovers the underlying causal dependencies, with provable identifiability of the state-dependent causal structures. Empirical experiments on nonlinear particle interaction data and gene regulatory networks demonstrate SDCI’s superior performance over baseline causal discovery methods. Improved results over non-causal RNNs on modeling NBA player movements demonstrate the potential of our method and motivate the use of causality-driven methods for forecasting.
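To make the notion of a conditionally stationary time series concrete, the sketch below simulates a toy dataset in which a discrete latent state selects which causal graph governs each transition. This is only an illustrative data-generating example under assumed linear dynamics and an assumed two-state Markov chain; it is not the authors' SDCI method, and all variable names, coefficients, and transition probabilities are hypothetical.

```python
# Toy sketch (assumptions, not the paper's model): a nonstationary series that is
# stationary conditioned on a discrete latent state, with state-dependent causal graphs.
import numpy as np

rng = np.random.default_rng(0)

T, D = 500, 3          # time steps, observed variables
num_states = 2         # latent states; dynamics are stationary *given* the state

# One (assumed) weighted adjacency matrix per state: entry [i, j] is the
# effect of x_j at time t-1 on x_i at time t.
A = np.zeros((num_states, D, D))
A[0, 1, 0] = 0.8       # state 0: x0 -> x1
A[1, 2, 1] = 0.8       # state 1: x1 -> x2
for s in range(num_states):
    A[s] += 0.3 * np.eye(D)   # mild self-dependence in both states

# Simple Markov chain over the latent state (illustrative assumption).
P = np.array([[0.95, 0.05],
              [0.05, 0.95]])

x = np.zeros((T, D))
states = np.zeros(T, dtype=int)
s = 0
for t in range(1, T):
    s = rng.choice(num_states, p=P[s])
    states[t] = s
    # Linear state-dependent transition plus Gaussian noise: the series is
    # nonstationary overall but stationary conditioned on the state sequence.
    x[t] = A[s] @ x[t - 1] + 0.1 * rng.standard_normal(D)

# A state-dependent causal discovery method should recover, per state, which past
# variables drive which future variables (here: the sparsity pattern of each A[s]).
print("state counts:", np.bincount(states))
```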
Cite
Text
Balsells-Rodas et al. "Causal Discovery from Conditionally Stationary Time Series." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Balsells-Rodas et al. "Causal Discovery from Conditionally Stationary Time Series." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/balsellsrodas2025icml-causal/)
BibTeX
@inproceedings{balsellsrodas2025icml-causal,
title = {{Causal Discovery from Conditionally Stationary Time Series}},
author = {Balsells-Rodas, Carles and Sumba, Xavier and Narendra, Tanmayee and Tu, Ruibo and Schweikert, Gabriele and Kjellstrom, Hedvig and Li, Yingzhen},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {2715--2741},
volume = {267},
url = {https://mlanthology.org/icml/2025/balsellsrodas2025icml-causal/}
}