The Nearest Neighbor Information Estimator Is Adaptively near Minimax Rate-Optimal
Abstract
We analyze the Kozachenko–Leonenko (KL) fixed k-nearest neighbor estimator of differential entropy. We obtain the first uniform upper bound on its performance for any fixed k over Hölder balls on a torus, without assuming any conditions on how close the density can be to zero. Combined with a recent minimax lower bound over the Hölder ball, this shows that the KL estimator, for any fixed k, achieves the minimax rates up to logarithmic factors without knowledge of the smoothness parameter s of the Hölder ball, for s ∈ (0, 2] and arbitrary dimension d, making it the first estimator proved to satisfy this property.
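The abstract does not reproduce the estimator itself. As a reference, here is a minimal Python sketch of the fixed-k Kozachenko–Leonenko estimator in its standard form, ĥ = ψ(n) − ψ(k) + log V_d + (d/n) Σᵢ log Rᵢ,ₖ, where Rᵢ,ₖ is the distance from sample i to its k-th nearest neighbor and V_d is the volume of the unit Euclidean ball (some variants use log(n−1) in place of ψ(n)). The function name `kl_entropy` and the use of the Euclidean metric on ℝ^d, rather than the toroidal metric analyzed in the paper, are assumptions made for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=1):
    """Kozachenko-Leonenko fixed-k nearest neighbor estimate of
    differential entropy, in nats.

    x : array of shape (n, d), i.i.d. samples from the density.
    k : fixed number of nearest neighbors (assumes no duplicate samples,
        so all k-NN distances are strictly positive).
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # Query k+1 neighbors because each point's nearest neighbor is itself.
    dist, _ = tree.query(x, k=k + 1)
    r_k = dist[:, -1]  # distance to the k-th nearest neighbor
    # log-volume of the unit Euclidean ball in d dimensions:
    # V_d = pi^(d/2) / Gamma(d/2 + 1)
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(r_k))
```

As a sanity check, for i.i.d. standard normal samples in one dimension the estimate should approach the true differential entropy (1/2) log(2πe) ≈ 1.4189 nats as n grows.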
Cite
Text
Jiao et al. "The Nearest Neighbor Information Estimator Is Adaptively near Minimax Rate-Optimal." Neural Information Processing Systems, 2018.
Markdown
[Jiao et al. "The Nearest Neighbor Information Estimator Is Adaptively near Minimax Rate-Optimal." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/jiao2018neurips-nearest/)
BibTeX
@inproceedings{jiao2018neurips-nearest,
title = {{The Nearest Neighbor Information Estimator Is Adaptively near Minimax Rate-Optimal}},
author = {Jiao, Jiantao and Gao, Weihao and Han, Yanjun},
booktitle = {Neural Information Processing Systems},
year = {2018},
pages = {3156--3167},
url = {https://mlanthology.org/neurips/2018/jiao2018neurips-nearest/}
}