Rodeo: Sparse Nonparametric Regression in High Dimensions
Abstract
We present a method for nonparametric regression that performs bandwidth selection and variable selection simultaneously. The approach is based on the technique of incrementally decreasing the bandwidth in directions where the gradient of the estimator with respect to bandwidth is large. When the unknown function satisfies a sparsity condition, our approach avoids the curse of dimensionality, achieving the optimal minimax rate of convergence, up to logarithmic factors, as if the relevant variables were known in advance. The method, called the rodeo (regularization of derivative expectation operator), conducts a sequence of hypothesis tests and is easy to implement. A modified version that replaces hard with soft thresholding effectively solves a sequence of lasso problems.
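The greedy bandwidth-selection idea in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's exact algorithm: it uses a Nadaraya-Watson estimator with a product Gaussian kernel, approximates the derivative of the estimate with respect to each bandwidth by a finite difference, and hard-thresholds that derivative against a simplified variance-based threshold (the noise level is assumed known here; the paper estimates it and uses a more careful threshold). Bandwidths for relevant coordinates keep shrinking; irrelevant coordinates fail the test early and keep a wide bandwidth.

```python
import numpy as np

def rodeo_point(x0, X, y, sigma, h0=1.0, beta=0.9, eps=1e-4):
    """Sketch of rodeo-style bandwidth selection at a single point x0.

    sigma: noise standard deviation (assumed known for this illustration).
    Returns a per-coordinate bandwidth vector h.
    """
    n, d = X.shape
    hmin = h0 / n                      # floor to guarantee termination
    h = np.full(d, h0)
    active = set(range(d))
    while active:
        for j in list(active):
            # Nadaraya-Watson weights at current bandwidths and with h_j perturbed
            w = np.exp(-0.5 * (((X - x0) / h) ** 2).sum(axis=1))
            hp = h.copy()
            hp[j] += eps
            wp = np.exp(-0.5 * (((X - x0) / hp) ** 2).sum(axis=1))
            # Z_j = c @ y is a finite-difference estimate of d m_h(x0) / d h_j,
            # and is linear in y, so its (conditional) sd is sigma * ||c||.
            c = (wp / wp.sum() - w / w.sum()) / eps
            Z = c @ y
            lam = sigma * np.sqrt((c ** 2).sum()) * np.sqrt(2.0 * np.log(n))
            if abs(Z) > lam and h[j] * beta > hmin:
                h[j] *= beta           # estimator still sensitive to h_j: shrink
            else:
                active.discard(j)      # below threshold: freeze this bandwidth
    return h

# Toy data: only the first of three coordinates is relevant.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 3))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(500)
h = rodeo_point(np.array([0.5, 0.0, 0.0]), X, y, sigma=0.1)
print(h)
```

In this toy run, the bandwidth for the relevant coordinate is driven down while the irrelevant coordinates stay near the starting value, which is the mechanism behind the "as if the relevant variables were known" rate.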
Cite
Wasserman and Lafferty. "Rodeo: Sparse Nonparametric Regression in High Dimensions." Neural Information Processing Systems, 2005.

BibTeX
@inproceedings{wasserman2005neurips-rodeo,
title = {{Rodeo: Sparse Nonparametric Regression in High Dimensions}},
author = {Wasserman, Larry and Lafferty, John D.},
booktitle = {Neural Information Processing Systems},
year = {2005},
pages = {707-714},
url = {https://mlanthology.org/neurips/2005/wasserman2005neurips-rodeo/}
}