Gradient Weights Help Nonparametric Regressors

Abstract

In regression problems over $\mathbb{R}^d$, the unknown function $f$ often varies more in some coordinates than in others. We show that weighting each coordinate $i$ with the estimated norm of the $i$th derivative of $f$ is an efficient way to significantly improve the performance of distance-based regressors, e.g. kernel and $k$-NN regressors. We propose a simple estimator of these derivative norms and prove its consistency. Moreover, the proposed estimator is efficiently learned online.
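As a rough illustration of the idea described in the abstract, the sketch below estimates a per-coordinate weight from finite differences of a pilot $k$-NN fit, then runs $k$-NN regression in the rescaled space. The pilot estimator, the step size `t`, and the use of scikit-learn's `KNeighborsRegressor` are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor


def estimate_gradient_weights(X, y, k=10, t=0.1):
    """Estimate per-coordinate weights via finite differences of a pilot k-NN fit.

    Sketch only: weight coordinate i by an empirical estimate of the norm of the
    i-th partial derivative of f. The pilot estimator and step size t are
    illustrative choices, not the paper's exact estimator.
    """
    n, d = X.shape
    pilot = KNeighborsRegressor(n_neighbors=k).fit(X, y)
    weights = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = t
        # Average absolute finite difference along coordinate i over the sample.
        diffs = np.abs(pilot.predict(X + e) - pilot.predict(X - e)) / (2 * t)
        weights[i] = diffs.mean()
    return weights


def gradient_weighted_knn(X_train, y_train, X_test, k=10):
    """k-NN regression after multiplying coordinate i by its estimated weight."""
    w = estimate_gradient_weights(X_train, y_train, k=k)
    w = np.maximum(w, 1e-12)  # keep coordinates from collapsing to zero
    knn = KNeighborsRegressor(n_neighbors=k).fit(X_train * w, y_train)
    return knn.predict(X_test * w)


if __name__ == "__main__":
    # Toy example: f varies mostly along coordinate 0, so that coordinate
    # should receive the largest weight.
    rng = np.random.default_rng(0)
    X = rng.uniform(size=(500, 5))
    y = np.sin(4 * X[:, 0]) + 0.05 * rng.normal(size=500)
    print(estimate_gradient_weights(X, y))
    print(gradient_weighted_knn(X, y, X[:5]))
```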

Cite

Text

Kpotufe and Boularias. "Gradient Weights Help Nonparametric Regressors." Neural Information Processing Systems, 2012.

Markdown

[Kpotufe and Boularias. "Gradient Weights Help Nonparametric Regressors." Neural Information Processing Systems, 2012.](https://mlanthology.org/neurips/2012/kpotufe2012neurips-gradient/)

BibTeX

@inproceedings{kpotufe2012neurips-gradient,
  title     = {{Gradient Weights Help Nonparametric Regressors}},
  author    = {Kpotufe, Samory and Boularias, Abdeslam},
  booktitle = {Neural Information Processing Systems},
  year      = {2012},
  pages     = {2861--2869},
  url       = {https://mlanthology.org/neurips/2012/kpotufe2012neurips-gradient/}
}