Fundamental Properties of Causal Entropy and Information Gain
Abstract
Recent developments enable the quantification of causal control given a structural causal model (SCM). This has been accomplished by introducing quantities which encode changes in the entropy of one variable when intervening on another. These measures, named causal entropy and causal information gain, aim to address limitations in existing information-theoretic approaches for machine learning tasks where causality plays a crucial role. However, they have not yet been studied mathematically in depth. Our research contributes to the formal understanding of the notions of causal entropy and causal information gain by establishing and analyzing fundamental properties of these concepts, including bounds and chain rules. Furthermore, we elucidate the relationship between causal entropy and stochastic interventions. We also propose definitions for causal conditional entropy and causal conditional information gain. Overall, this exploration paves the way for enhancing causal machine learning tasks through the study of recently proposed information-theoretic quantities grounded in considerations about causality.
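The paper's formal definitions are not reproduced in this abstract, but the core idea — that intervening on one variable changes the entropy of another — can be sketched in a toy example. The following is an illustrative sketch only, not the authors' construction: a two-variable SCM with X → Y, where we compare the observational entropy of Y with its entropy under an atomic intervention do(X = x). All distributions here are assumptions chosen for illustration.

```python
from math import log2

def entropy(p):
    """Shannon entropy (in bits) of a distribution given as a value -> probability dict."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

# Toy SCM: X ~ Bernoulli(0.5); Y = X XOR N, with exogenous noise N ~ Bernoulli(0.1).
p_noise = {0: 0.9, 1: 0.1}

def p_y_given_do_x(x):
    """Distribution of Y under the atomic intervention do(X = x)."""
    return {x ^ n: p for n, p in p_noise.items()}

# Observational distribution of Y: mixture over x of P(Y | do(X = x)).
# (Here this equals P(Y | X = x) since X has no parents in the SCM.)
p_y = {y: 0.5 * p_y_given_do_x(0).get(y, 0.0) + 0.5 * p_y_given_do_x(1).get(y, 0.0)
       for y in (0, 1)}

print(entropy(p_y))                # observational entropy of Y: 1.0 bit
print(entropy(p_y_given_do_x(0)))  # entropy of Y under do(X=0): ≈ 0.469 bits
```

The intervention removes the uncertainty contributed by X, leaving only the noise entropy — the kind of entropy reduction under intervention that causal entropy and causal information gain are designed to quantify (the paper's actual definitions involve distributions over interventions and should be consulted directly).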
Cite
Text
Simoes et al. "Fundamental Properties of Causal Entropy and Information Gain." Proceedings of the Third Conference on Causal Learning and Reasoning, 2024.

Markdown

[Simoes et al. "Fundamental Properties of Causal Entropy and Information Gain." Proceedings of the Third Conference on Causal Learning and Reasoning, 2024.](https://mlanthology.org/clear/2024/simoes2024clear-fundamental/)

BibTeX
@inproceedings{simoes2024clear-fundamental,
title = {{Fundamental Properties of Causal Entropy and Information Gain}},
author = {Simoes, Francisco N. F. Q. and Dastani, Mehdi and van Ommen, Thijs},
booktitle = {Proceedings of the Third Conference on Causal Learning and Reasoning},
year = {2024},
pages = {188--208},
volume = {236},
url = {https://mlanthology.org/clear/2024/simoes2024clear-fundamental/}
}