Exponential Concentration of a Density Functional Estimator
Abstract
We analyse a plug-in estimator for a large class of integral functionals of one or more continuous probability densities. This class includes important families of entropy, divergence, mutual information, and their conditional versions. For densities on the d-dimensional unit cube [0,1]^d that lie in a β-Hölder smoothness class, we prove our estimator converges at the rate O(n^{-β/(β+d)}). Furthermore, we prove that the estimator obeys an exponential concentration inequality about its mean, whereas most previous related results have bounded only the expected error of estimators. Finally, we apply our bounds to the case of conditional Rényi mutual information.
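To make the setting concrete, here is a minimal sketch of a plug-in estimator for one member of the covered class, Shannon entropy H(p) = -∫ p log p, on [0,1] (d = 1). It plugs a kernel density estimate with the bias-optimal bandwidth h ≈ n^(-1/(β+d)) into the functional. The plain Gaussian KDE, the grid size m, and the uniform example below are illustrative assumptions, not the paper's exact construction, which additionally controls boundary bias on the unit cube.

import numpy as np

def kde_on_cube(samples, grid, h):
    # Gaussian-kernel density estimate evaluated at grid points.
    # samples: (n, d) array of draws from p; grid: (m, d) evaluation points.
    # (Illustrative simplification: no boundary correction on [0,1]^d.)
    n, d = samples.shape
    diffs = grid[:, None, :] - samples[None, :, :]   # (m, n, d)
    sq = np.sum((diffs / h) ** 2, axis=-1)           # (m, n)
    kernel = np.exp(-0.5 * sq) / ((2 * np.pi) ** (d / 2) * h ** d)
    return kernel.mean(axis=1)                       # (m,) density estimates

def plugin_entropy(samples, beta=2.0, m=200):
    # Plug-in estimate of H(p) = -∫ p log p on [0,1], using the
    # bias-optimal bandwidth h ≈ n^(-1/(β+d)) implied by the rate.
    n, d = samples.shape
    h = n ** (-1.0 / (beta + d))
    grid = (np.arange(m)[:, None] + 0.5) / m         # midpoint grid on [0,1]
    p_hat = np.maximum(kde_on_cube(samples, grid, h), 1e-12)
    return -np.mean(p_hat * np.log(p_hat))           # Riemann sum, cell width 1/m

# Example (hypothetical values): the entropy of Uniform[0,1] is 0; the
# estimate comes out close, with the gap driven by the boundary bias that
# the paper's estimator is designed to remove.
rng = np.random.default_rng(0)
x = rng.random((2000, 1))
print(plugin_entropy(x))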
Cite
Text
Singh and Poczos. "Exponential Concentration of a Density Functional Estimator." Neural Information Processing Systems, 2014.Markdown
[Singh and Poczos. "Exponential Concentration of a Density Functional Estimator." Neural Information Processing Systems, 2014.](https://mlanthology.org/neurips/2014/singh2014neurips-exponential/)BibTeX
@inproceedings{singh2014neurips-exponential,
title = {{Exponential Concentration of a Density Functional Estimator}},
author = {Singh, Shashank and Poczos, Barnabas},
booktitle = {Neural Information Processing Systems},
year = {2014},
pages = {3032-3040},
url = {https://mlanthology.org/neurips/2014/singh2014neurips-exponential/}
}