Smoothed Nonparametric Derivative Estimation Using Weighted Difference Quotients

Abstract

Derivatives play an important role in bandwidth selection methods (e.g., plug-ins), data analysis, and bias-corrected confidence intervals; obtaining accurate derivative information is therefore crucial. Although many derivative estimation methods exist, the majority require a fixed design assumption. In this paper, we propose an effective and fully data-driven framework to estimate the first- and second-order derivatives in random design. We establish the asymptotic properties of the proposed derivative estimator and also propose a fast selection method for the tuning parameters. The performance and flexibility of the method are illustrated via an extensive simulation study.
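To make the core idea concrete, here is a minimal sketch of first-derivative estimation from weighted symmetric difference quotients on a (possibly unsorted) random design. The variance-minimizing weights proportional to the squared spacings are a common choice in this literature and are an assumption here; the paper's exact weights, the second-order estimator, and the subsequent smoothing step may differ.

```python
import numpy as np

def first_derivative_wdq(x, y, k=2):
    """Estimate f'(x_i) at interior points as a weighted average of k
    symmetric difference quotients. Weights proportional to the squared
    symmetric spacings (a variance-minimizing choice under i.i.d. noise)
    are an illustrative assumption, not necessarily the paper's formula."""
    order = np.argsort(x)              # random design: sort by x first
    x, y = np.asarray(x)[order], np.asarray(y)[order]
    n = len(x)
    d = np.full(n, np.nan)             # boundary points left undefined
    j = np.arange(1, k + 1)
    for i in range(k, n - k):
        dx = x[i + j] - x[i - j]       # symmetric spacings around x_i
        w = dx**2 / np.sum(dx**2)      # normalized weights, sum to 1
        d[i] = np.sum(w * (y[i + j] - y[i - j]) / dx)
    return x, d
```

On noise-free data the estimator is exact for quadratics on an equispaced grid, since each symmetric quotient of x² at x_i equals 2·x_i; with noisy random-design data, the weighting trades bias against variance before any further smoothing is applied.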

Cite

Text

Liu and De Brabanter. "Smoothed Nonparametric Derivative Estimation Using Weighted Difference Quotients." Journal of Machine Learning Research, 2020.

Markdown

[Liu and De Brabanter. "Smoothed Nonparametric Derivative Estimation Using Weighted Difference Quotients." Journal of Machine Learning Research, 2020.](https://mlanthology.org/jmlr/2020/liu2020jmlr-smoothed/)

BibTeX

@article{liu2020jmlr-smoothed,
  title     = {{Smoothed Nonparametric Derivative Estimation Using Weighted Difference Quotients}},
  author    = {Liu, Yu and De Brabanter, Kris},
  journal   = {Journal of Machine Learning Research},
  year      = {2020},
  pages     = {1--45},
  volume    = {21},
  url       = {https://mlanthology.org/jmlr/2020/liu2020jmlr-smoothed/}
}