Consistency of Robust Kernel Density Estimators
Abstract
The kernel density estimator (KDE) based on a radial, positive semi-definite kernel may be viewed as a sample mean in a reproducing kernel Hilbert space. This mean is the solution of a least squares problem in that space, and replacing the squared loss with a robust loss yields a robust kernel density estimator (RKDE). Previous work has shown that RKDEs are weighted kernel density estimators with desirable robustness properties. In this paper we establish asymptotic L^1 consistency of the RKDE for a class of losses, and show that the RKDE converges under the same rate conditions on the bandwidth as the traditional KDE. We also present a novel proof of the consistency of the traditional KDE.
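The construction sketched in the abstract can be made concrete. Prior work by Kim and Scott shows that the RKDE minimizer of (1/n) sum_i rho(||Phi(x_i) - f||_H) over the RKHS can be computed by a kernelized iteratively reweighted least squares (KIRWLS) procedure whose fixed point is a weighted KDE f = sum_i w_i k(x_i, .). Below is a minimal sketch of that idea, assuming a Gaussian kernel and the Huber loss; the bandwidth sigma, the Huber threshold a, the median heuristic for choosing it, and all function names are illustrative assumptions, not prescriptions from the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    """Radial, positive semi-definite Gaussian kernel matrix.
    k(x, y) = (2*pi*sigma^2)^(-d/2) * exp(-||x - y||^2 / (2*sigma^2)),
    normalized so that each k(x_i, .) is itself a density."""
    d = X.shape[1]
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2 * sigma ** 2)) / (2 * np.pi * sigma ** 2) ** (d / 2)

def rkde_weights(X, sigma, a=None, tol=1e-8, max_iter=100):
    """KIRWLS for the Huber-loss RKDE: returns nonnegative weights
    w_1, ..., w_n summing to one. Setting w_i = 1/n recovers the ordinary KDE."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    w = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        # ||Phi(x_i) - f||_H^2 = K_ii - 2 (K w)_i + w' K w, via the kernel trick
        norms = np.sqrt(np.maximum(np.diag(K) - 2.0 * (K @ w) + w @ K @ w, 0.0))
        if a is None:
            a = np.median(norms)  # heuristic Huber threshold (an assumption here)
        # Huber loss: psi(x)/x = min(1, a/x), so points far from f in the RKHS
        # (e.g. outliers) are downweighted
        phi = np.minimum(1.0, a / np.maximum(norms, 1e-12))
        w_new = phi / phi.sum()
        if np.max(np.abs(w_new - w)) < tol:
            return w_new
        w = w_new
    return w

def rkde(x_query, X, w, sigma):
    """Evaluate the weighted KDE f(x) = sum_i w_i * k_sigma(x, x_i)."""
    return gaussian_kernel(x_query, X, sigma) @ w

# Example: a contaminated one-dimensional sample
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0.0, 1.0, (95, 1)),   # inliers
                    rng.normal(8.0, 0.5, (5, 1))])   # contamination
w = rkde_weights(X, sigma=0.5)
xs = np.linspace(-4.0, 10.0, 200)[:, None]
f_hat = rkde(xs, X, w, sigma=0.5)                    # robust density estimate
```

The contaminating points receive small weights, so the resulting estimate places less mass near them than the ordinary KDE would; this is the robustness property whose consistency the paper analyzes.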
Cite
Text
Vandermeulen and Scott. "Consistency of Robust Kernel Density Estimators." Annual Conference on Computational Learning Theory, 2013.

Markdown
[Vandermeulen and Scott. "Consistency of Robust Kernel Density Estimators." Annual Conference on Computational Learning Theory, 2013.](https://mlanthology.org/colt/2013/vandermeulen2013colt-consistency/)

BibTeX
@inproceedings{vandermeulen2013colt-consistency,
title = {{Consistency of Robust Kernel Density Estimators}},
author = {Vandermeulen, Robert A. and Scott, Clayton D.},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2013},
pages = {568--591},
url = {https://mlanthology.org/colt/2013/vandermeulen2013colt-consistency/}
}