Direct Density-Derivative Estimation and Its Application in KL-Divergence Approximation
Abstract
Estimation of density derivatives is a versatile tool in statistical data analysis. A naive approach is to first estimate the density and then compute its derivative. However, such a two-step approach does not work well because a good density estimator does not necessarily yield a good density-derivative estimator. In this paper, we give a direct method to approximate the density derivative without estimating the density itself. Our proposed estimator allows analytic and computationally efficient approximation of multi-dimensional high-order density derivatives, with all hyper-parameters chosen objectively by cross-validation. We further show that the proposed density-derivative estimator is useful in improving the accuracy of non-parametric KL-divergence estimation via metric learning. The practical superiority of the proposed method is experimentally demonstrated in change detection and feature selection.
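The page gives only the abstract, so the sketch below is an illustration of what a "direct" density-derivative estimator can look like, not the authors' reference implementation: it fits a Gaussian-basis model to one partial derivative of the density by regularized least squares, using integration by parts so that only samples from the density are needed. The function name, the basis choice, and the hyper-parameters `sigma`, `lam`, and `n_basis` (which would be tuned by cross-validation) are assumptions made for this sketch.

```python
# Illustrative sketch (assumed formulation, not taken from the paper): fit
# g(x) = sum_l theta_l * exp(-||x - c_l||^2 / (2 sigma^2)) to d p(x) / d x_dim
# by regularized least squares. Integration by parts turns the data term into
# an average of basis-function derivatives, so p itself is never estimated.
import numpy as np

def fit_density_derivative(X, dim, sigma=1.0, lam=1e-3, n_basis=100, seed=0):
    n, d = X.shape
    rng = np.random.default_rng(seed)
    C = X[rng.choice(n, size=min(n_basis, n), replace=False)]  # basis centers

    # G[l, l'] = integral of psi_l(x) * psi_l'(x) dx over R^d (closed form for Gaussians)
    sq = np.sum((C[:, None, :] - C[None, :, :]) ** 2, axis=2)
    G = (np.pi * sigma ** 2) ** (d / 2) * np.exp(-sq / (4 * sigma ** 2))

    # h[l] = (1/n) * sum_i d psi_l(x_i) / d x_dim, from integration by parts
    diff = X[:, None, :] - C[None, :, :]                        # (n, n_basis, d)
    K = np.exp(-np.sum(diff ** 2, axis=2) / (2 * sigma ** 2))   # (n, n_basis)
    h = -np.mean(diff[:, :, dim] / sigma ** 2 * K, axis=0)      # (n_basis,)

    # Ridge-regularized analytic solution: theta = -(G + lam I)^{-1} h
    theta = np.linalg.solve(G + lam * np.eye(len(C)), -h)

    def g(Xq):
        dq = Xq[:, None, :] - C[None, :, :]
        Kq = np.exp(-np.sum(dq ** 2, axis=2) / (2 * sigma ** 2))
        return Kq @ theta

    return g

# Usage: estimate the first partial derivative of a 2-D standard normal density.
X = np.random.default_rng(1).standard_normal((500, 2))
g = fit_density_derivative(X, dim=0, sigma=1.0, lam=1e-3)
print(g(np.array([[0.5, 0.0]])))  # true value is -0.5 * N([0.5, 0]; 0, I) ~ -0.07
```

The key point the abstract emphasizes carries over to the sketch: the coefficients have a closed-form solution, and `sigma` and `lam` can be selected by cross-validation on the same least-squares criterion.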
Cite
Text
Sasaki et al. "Direct Density-Derivative Estimation and Its Application in KL-Divergence Approximation." International Conference on Artificial Intelligence and Statistics, 2015.
Markdown
[Sasaki et al. "Direct Density-Derivative Estimation and Its Application in KL-Divergence Approximation." International Conference on Artificial Intelligence and Statistics, 2015.](https://mlanthology.org/aistats/2015/sasaki2015aistats-direct/)
BibTeX
@inproceedings{sasaki2015aistats-direct,
  title = {{Direct Density-Derivative Estimation and Its Application in KL-Divergence Approximation}},
  author = {Sasaki, Hiroaki and Noh, Yung-Kyun and Sugiyama, Masashi},
  booktitle = {International Conference on Artificial Intelligence and Statistics},
  year = {2015},
  url = {https://mlanthology.org/aistats/2015/sasaki2015aistats-direct/}
}