Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence
Abstract
Nearest-neighbor estimators for the Kullback-Leibler (KL) divergence that are asymptotically unbiased have recently been proposed and demonstrated in a number of applications. However, with a small number of samples, nonparametric methods typically suffer from large estimation bias due to the nonlocality of information derived from nearest-neighbor statistics. In this letter, we show that this estimation bias can be mitigated by modifying the metric function, and we propose a novel method for learning a locally optimal Mahalanobis distance function from parametric generative models of the underlying density distributions. Using both simulations and experiments on a variety of data sets, we demonstrate that this interplay between approximate generative models and nonparametric techniques can significantly improve the accuracy of nearest-neighbor-based estimation of the KL divergence.
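For context, the sketch below illustrates the standard k-nearest-neighbor KL divergence estimator (the asymptotically unbiased baseline the abstract refers to), which uses a fixed Euclidean metric throughout; the paper's contribution is to replace that metric with a locally learned Mahalanobis distance. The function name `knn_kl_divergence` and the use of SciPy's `cKDTree` are illustrative choices, not part of the paper.

```python
import numpy as np
from scipy.spatial import cKDTree


def knn_kl_divergence(x, y, k=1):
    """k-NN estimate of KL(P || Q) from samples x ~ P and y ~ Q.

    Standard Euclidean-metric estimator; the paper reduces its finite-sample
    bias by swapping in a locally learned Mahalanobis metric derived from
    approximate parametric generative models.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    # rho_i: distance from x_i to its k-th nearest neighbor among the other
    # x samples (query k + 1 because x_i itself is returned at distance 0).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]

    # nu_i: distance from x_i to its k-th nearest neighbor among the y samples.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]

    # KL_hat = (d / n) * sum_i log(nu_i / rho_i) + log(m / (n - 1))
    return d / n * np.sum(np.log(nu / rho)) + np.log(m / (n - 1))


# Example: two unit-variance Gaussians with means 0 and 1; the estimate
# should approach the analytic value KL = 0.5 as the sample sizes grow.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(2000, 1))
y = rng.normal(1.0, 1.0, size=(2000, 1))
print(knn_kl_divergence(x, y, k=5))
```

In the metric-learning variant described in the abstract, one would transform the data locally by the learned Mahalanobis metric before computing the nearest-neighbor distances above; the details of how that metric is obtained are given in the paper itself.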
Cite
Text
Noh et al. "Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence." International Conference on Artificial Intelligence and Statistics, 2014. doi:10.1162/neco_a_01092
Markdown
[Noh et al. "Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence." International Conference on Artificial Intelligence and Statistics, 2014.](https://mlanthology.org/aistats/2014/noh2014aistats-bias/) doi:10.1162/neco_a_01092
BibTeX
@inproceedings{noh2014aistats-bias,
title = {{Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence}},
author = {Noh, Yung-Kyun and Sugiyama, Masashi and Liu, Song and du Plessis, Marthinus Christoffel and Park, Frank Chongwoo and Lee, Daniel D.},
booktitle = {International Conference on Artificial Intelligence and Statistics},
year = {2014},
pages = {669-677},
doi = {10.1162/neco_a_01092},
url = {https://mlanthology.org/aistats/2014/noh2014aistats-bias/}
}