Causal Inference Despite Limited Global Confounding via Mixture Models

Abstract

A Bayesian Network is a directed acyclic graph (DAG) on a set of $n$ random variables (the vertices); a Bayesian Network Distribution (BND) is a probability distribution on the random variables that is Markovian on the graph. A finite $k$-mixture of such models is graphically represented by a larger graph which has an additional “hidden” (or “latent”) random variable $U$, ranging in $\{1,\ldots,k\}$, and a directed edge from $U$ to every other vertex. Models of this type are fundamental to causal inference, where $U$ models an unobserved confounding effect of multiple populations, obscuring the causal relationships in the observable DAG. By solving the mixture problem and recovering the joint probability distribution with $U$, traditionally unidentifiable causal relationships become identifiable. Using a reduction to the better-studied “product” case on empty graphs, we give the first algorithm to learn mixtures of non-empty DAGs.

Cite

Text

Gordon et al. "Causal Inference Despite Limited Global Confounding via Mixture Models." Proceedings of the Second Conference on Causal Learning and Reasoning, 2023.

Markdown

[Gordon et al. "Causal Inference Despite Limited Global Confounding via Mixture Models." Proceedings of the Second Conference on Causal Learning and Reasoning, 2023.](https://mlanthology.org/clear/2023/gordon2023clear-causal/)

BibTeX

@inproceedings{gordon2023clear-causal,
  title     = {{Causal Inference Despite Limited Global Confounding via Mixture Models}},
  author    = {Gordon, Spencer L. and Mazaheri, Bijan and Rabani, Yuval and Schulman, Leonard},
  booktitle = {Proceedings of the Second Conference on Causal Learning and Reasoning},
  year      = {2023},
  pages     = {574--601},
  volume    = {213},
  url       = {https://mlanthology.org/clear/2023/gordon2023clear-causal/}
}