Nonparanormal Information Estimation
Abstract
We study the problem of using i.i.d. samples from an unknown multivariate probability distribution p to estimate the mutual information of p. This problem has recently received attention in two settings: (1) where p is assumed to be Gaussian and (2) where p is assumed only to lie in a large nonparametric smoothness class. Estimators proposed for the Gaussian case converge in high dimensions when the Gaussian assumption holds, but are brittle, failing dramatically when p is not Gaussian, while estimators proposed for the nonparametric case fail to converge with realistic sample sizes except in very low dimensions. Hence, there is a lack of robust mutual information estimators for many realistic datasets. To address this, we propose estimators for mutual information when p is assumed to be a nonparanormal (or Gaussian copula) model, a semiparametric compromise between the Gaussian and nonparametric extremes. Using theoretical bounds and experiments, we show that these estimators strike a practical balance between robustness and scalability.
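The paper's details are in the full text, but the general flavor of a nonparanormal (Gaussian copula) information estimator can be sketched as follows: since rank correlations are invariant to the unknown monotone marginal transforms, one can estimate the latent Gaussian correlation matrix from Spearman's rho and plug it into the closed-form Gaussian formula for multivariate mutual information (total correlation), I = -(1/2) log det(R). This is a minimal illustrative sketch, not the authors' exact estimator; the function name `nonparanormal_mi` and the plug-in construction are our own.

```python
import numpy as np
from scipy.stats import spearmanr


def nonparanormal_mi(x):
    """Sketch of a rank-based plug-in estimator of multivariate mutual
    information (total correlation) under a nonparanormal model.

    x : array of shape (n_samples, d)
    """
    n, d = x.shape
    rho, _ = spearmanr(x)
    if d == 2:
        # spearmanr returns a scalar for two columns; build the 2x2 matrix
        rho = np.array([[1.0, rho], [rho, 1.0]])
    # Map Spearman's rho to the latent Pearson correlation of the
    # underlying Gaussian: r = 2 sin(pi * rho / 6)
    R = 2.0 * np.sin(np.pi * np.asarray(rho) / 6.0)
    np.fill_diagonal(R, 1.0)
    # Gaussian total correlation: I = -1/2 log det(R)
    _, logdet = np.linalg.slogdet(R)
    return -0.5 * logdet
```

Because the estimator depends on the data only through ranks, applying any strictly increasing transform to each coordinate (e.g. exponentiating a Gaussian sample) leaves the estimate unchanged, which is exactly the robustness the nonparanormal family buys over a plain Gaussian estimator.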
Cite
Text
Singh and Póczos. "Nonparanormal Information Estimation." International Conference on Machine Learning, 2017.
Markdown
[Singh and Póczos. "Nonparanormal Information Estimation." International Conference on Machine Learning, 2017.](https://mlanthology.org/icml/2017/singh2017icml-nonparanormal/)
BibTeX
@inproceedings{singh2017icml-nonparanormal,
title = {{Nonparanormal Information Estimation}},
author = {Singh, Shashank and Póczos, Barnabás},
booktitle = {International Conference on Machine Learning},
year = {2017},
pages = {3210--3219},
volume = {70},
url = {https://mlanthology.org/icml/2017/singh2017icml-nonparanormal/}
}