Generalized Exponential Concentration Inequality for Renyi Divergence Estimation

Abstract

Estimating divergences between probability distributions in a consistent way is of great importance in many machine learning tasks. Although this is a fundamental problem in nonparametric statistics, to the best of our knowledge no finite-sample exponential concentration bound has been derived for any divergence estimator. The main contribution of our work is to provide such a bound for an estimator of Rényi divergence for a smooth Hölder class of densities on the d-dimensional unit cube. We also illustrate our theoretical results with a numerical experiment.
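
For context, the Rényi-α divergence between densities p and q is D_α(p‖q) = (α−1)⁻¹ log ∫ p(x)^α q(x)^{1−α} dx. The snippet below is a minimal, generic plug-in sketch of this quantity in one dimension, assuming off-the-shelf Gaussian kernel density estimates and a midpoint-rule integral over [0, 1]; it is not the estimator analyzed in the paper, and it omits the boundary correction, bandwidth choice, and truncation that the paper's concentration analysis relies on. The function name and parameters are illustrative.

# A minimal, generic plug-in sketch (not the authors' estimator):
# D_alpha(p || q) = 1/(alpha - 1) * log \int p(x)^alpha q(x)^(1 - alpha) dx,
# estimated by substituting kernel density estimates for p and q and
# approximating the integral with a midpoint rule on [0, 1].
import numpy as np
from scipy.stats import gaussian_kde

def renyi_divergence_plugin(x, y, alpha=0.5, grid_size=400, eps=1e-6):
    """Plug-in estimate of D_alpha(p || q) from samples x ~ p, y ~ q on [0, 1]."""
    p_hat = gaussian_kde(x.T)  # Gaussian KDE of p (Scott's-rule bandwidth)
    q_hat = gaussian_kde(y.T)  # Gaussian KDE of q
    # Midpoint-rule grid over the unit interval.
    grid = ((np.arange(grid_size) + 0.5) / grid_size)[None, :]
    p_vals = np.maximum(p_hat(grid), eps)  # clip away from zero (densities are
    q_vals = np.maximum(q_hat(grid), eps)  # assumed bounded below in the paper)
    integrand = p_vals ** alpha * q_vals ** (1.0 - alpha)
    integral = integrand.mean()  # midpoint rule; the interval has length 1
    return np.log(integral) / (alpha - 1.0)

# Example: two Beta densities on [0, 1].
rng = np.random.default_rng(0)
x = rng.beta(2.0, 5.0, size=(2000, 1))
y = rng.beta(5.0, 2.0, size=(2000, 1))
print(renyi_divergence_plugin(x, y, alpha=0.5))

In the paper's setting the densities belong to a Hölder class on the unit cube and are bounded away from zero; the eps clipping above is only a crude stand-in for that assumption.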

Cite

Text

Singh and Poczos. "Generalized Exponential Concentration Inequality for Renyi Divergence Estimation." International Conference on Machine Learning, 2014.

Markdown

[Singh and Poczos. "Generalized Exponential Concentration Inequality for Renyi Divergence Estimation." International Conference on Machine Learning, 2014.](https://mlanthology.org/icml/2014/singh2014icml-generalized/)

BibTeX

@inproceedings{singh2014icml-generalized,
  title     = {{Generalized Exponential Concentration Inequality for Renyi Divergence Estimation}},
  author    = {Singh, Shashank and Poczos, Barnabas},
  booktitle = {International Conference on Machine Learning},
  year      = {2014},
  pages     = {333-341},
  volume    = {32},
  url       = {https://mlanthology.org/icml/2014/singh2014icml-generalized/}
}