Formal Limitations on the Measurement of Mutual Information
Abstract
Measuring mutual information from finite data is difficult. Recent work has considered variational methods maximizing a lower bound. In this paper, we prove that serious statistical limitations are inherent to any method of measuring mutual information. More specifically, we show that any distribution-free high-confidence lower bound on mutual information estimated from N samples cannot be larger than O(ln N).
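As a quick illustration of the headline result (this numerical sketch is ours, not from the paper): since any distribution-free high-confidence lower bound from N samples is at most O(ln N) nats, tabulating ln N shows how slowly that ceiling grows with sample size.

```python
import math

# The paper's bound: a distribution-free high-confidence lower bound on
# mutual information from N samples cannot exceed O(ln N) nats. Even
# astronomically large sample sizes give a small ceiling.
for n in [10**3, 10**6, 10**9, 10**12]:
    print(f"N = {n:>15,}  ceiling ~ ln N = {math.log(n):6.2f} nats")
```

For example, a trillion samples still caps a certifiable lower bound at roughly 27.6 nats, which is why high-mutual-information settings are out of reach for such estimators.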
Cite
Text
McAllester and Stratos. "Formal Limitations on the Measurement of Mutual Information." Artificial Intelligence and Statistics, 2020.
BibTeX
@inproceedings{mcallester2020aistats-formal,
title = {{Formal Limitations on the Measurement of Mutual Information}},
author = {McAllester, David and Stratos, Karl},
booktitle = {Artificial Intelligence and Statistics},
year = {2020},
pages = {875--884},
volume = {108},
url = {https://mlanthology.org/aistats/2020/mcallester2020aistats-formal/}
}