Robust Kernel Density Estimation
Abstract
We propose a method for nonparametric density estimation that exhibits robustness to contamination of the training sample. This method achieves robustness by combining a traditional kernel density estimator (KDE) with ideas from classical M-estimation. We interpret the KDE based on a positive semi-definite kernel as a sample mean in the associated reproducing kernel Hilbert space. Since the sample mean is sensitive to outliers, we estimate it robustly via M-estimation, yielding a robust kernel density estimator (RKDE). An RKDE can be computed efficiently via a kernelized iteratively re-weighted least squares (IRWLS) algorithm. Necessary and sufficient conditions are given for kernelized IRWLS to converge to the global minimizer of the M-estimator objective function. The robustness of the RKDE is demonstrated with a representer theorem, the influence function, and experimental results for density estimation and anomaly detection.
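The abstract's kernelized IRWLS idea can be sketched concretely: the RKDE is a weighted KDE f(x) = Σᵢ wᵢ k(x, xᵢ), where IRWLS repeatedly recomputes each weight from the RKHS distance ‖Φ(xᵢ) − f‖ through the loss's ψ function. The sketch below uses a Gaussian kernel and a Huber-type loss; the bandwidth `sigma` and threshold `c` are illustrative choices, not values from the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    # Gaussian (RBF) kernel matrix; the density normalization constant
    # cancels in the weight updates, so it is omitted here.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def rkde_weights(X, sigma=1.0, c=0.5, n_iter=100, tol=1e-8):
    """Kernelized IRWLS sketch for a robust KDE with a Huber-type loss.

    Returns weights w such that the RKDE is f(x) = sum_i w_i k_sigma(x, x_i).
    A minimal illustration of the idea, not the paper's exact algorithm.
    """
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    w = np.full(n, 1.0 / n)  # initialize at the ordinary KDE
    for _ in range(n_iter):
        # squared RKHS distance ||Phi(x_i) - f||^2 expanded via the kernel trick
        d2 = np.diag(K) - 2.0 * (K @ w) + w @ K @ w
        d = np.sqrt(np.maximum(d2, 0.0))
        # Huber psi(d)/d: constant inside the threshold, downweighted outside
        psi_over_d = np.where(d <= c, 1.0, c / np.maximum(d, 1e-12))
        w_new = psi_over_d / psi_over_d.sum()
        if np.abs(w_new - w).max() < tol:
            return w_new
        w = w_new
    return w
```

Points far from the bulk of the data get large RKHS distances and hence small weights, which is the source of the robustness; with no contamination, all distances fall under the threshold and the RKDE reduces to the ordinary KDE.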
Cite
Text
Kim and Scott. "Robust Kernel Density Estimation." Journal of Machine Learning Research, 2012.
Markdown
[Kim and Scott. "Robust Kernel Density Estimation." Journal of Machine Learning Research, 2012.](https://mlanthology.org/jmlr/2012/kim2012jmlr-robust/)
BibTeX
@article{kim2012jmlr-robust,
title = {{Robust Kernel Density Estimation}},
author = {Kim, JooSeuk and Scott, Clayton D.},
journal = {Journal of Machine Learning Research},
year = {2012},
pages = {2529--2565},
volume = {13},
url = {https://mlanthology.org/jmlr/2012/kim2012jmlr-robust/}
}