Nonparametric Von Mises Estimators for Entropies, Divergences and Mutual Informations

Abstract

We propose and analyse estimators for statistical functionals of one or more distributions under nonparametric assumptions. Our estimators are derived from the von Mises expansion and are based on the theory of influence functions, which appear in the semiparametric statistics literature. We show that estimators based either on data-splitting or a leave-one-out technique enjoy fast rates of convergence and other favorable theoretical properties. We apply this framework to derive estimators for several popular information theoretic quantities, and via empirical evaluation, show the advantage of this approach over existing estimators.
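For intuition, the data-splitting approach the abstract describes can be sketched for Shannon entropy. For H(P) = -E[log p(X)], the influence function is ψ(x) = -log p(x) - H(P), and the first-order von Mises correction to the plug-in H(p̂) reduces to averaging -log p̂ over a held-out sample. The sketch below is illustrative only, not the authors' implementation: it assumes a 1-D setting, a hand-rolled Gaussian KDE, and Silverman's bandwidth rule, none of which come from the paper.

```python
import numpy as np

def gaussian_kde(train, x, h):
    """1-D Gaussian kernel density estimate at points x, fit on `train`."""
    diffs = (x[:, None] - train[None, :]) / h
    return np.exp(-0.5 * diffs**2).sum(axis=1) / (len(train) * h * np.sqrt(2 * np.pi))

def entropy_data_split(samples, h=None):
    """Data-split von Mises entropy estimate: fit a density on one half,
    average -log density over the other half (plug-in + influence-function
    correction collapse to this for Shannon entropy)."""
    n = len(samples)
    half1, half2 = samples[: n // 2], samples[n // 2 :]
    if h is None:
        h = 1.06 * half1.std() * len(half1) ** (-0.2)  # Silverman's rule (assumption)
    dens = gaussian_kde(half1, half2, h)
    return -np.mean(np.log(dens))

rng = np.random.default_rng(0)
x = rng.normal(size=4000)
est = entropy_data_split(x)
true_H = 0.5 * np.log(2 * np.pi * np.e)  # entropy of N(0, 1), ~1.4189
```

On a standard normal sample the estimate should land near the true entropy; the paper's analysis concerns the rates at which such estimators converge.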

Cite

Text

Kandasamy et al. "Nonparametric Von Mises Estimators for Entropies, Divergences and Mutual Informations." Neural Information Processing Systems, 2015.

Markdown

[Kandasamy et al. "Nonparametric Von Mises Estimators for Entropies, Divergences and Mutual Informations." Neural Information Processing Systems, 2015.](https://mlanthology.org/neurips/2015/kandasamy2015neurips-nonparametric/)

BibTeX

@inproceedings{kandasamy2015neurips-nonparametric,
  title     = {{Nonparametric Von Mises Estimators for Entropies, Divergences and Mutual Informations}},
  author    = {Kandasamy, Kirthevasan and Krishnamurthy, Akshay and Poczos, Barnabas and Wasserman, Larry and Robins, James M},
  booktitle = {Neural Information Processing Systems},
  year      = {2015},
  pages     = {397--405},
  url       = {https://mlanthology.org/neurips/2015/kandasamy2015neurips-nonparametric/}
}