Non-Asymptotic Uniform Rates of Consistency for k-NN Regression
Abstract
We derive high-probability finite-sample uniform rates of consistency for k-NN regression that are optimal up to logarithmic factors under mild assumptions. We moreover show that k-NN regression adapts to an unknown lower intrinsic dimension automatically in the sup-norm. We then apply the k-NN regression rates to establish new results about estimating the level sets and global maxima of a function from noisy observations.
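The estimator the paper analyzes is ordinary k-NN regression: predict at a query point by averaging the responses of its k nearest training points. A minimal sketch (not the paper's code; the toy function, noise level, and the constant in the `k` schedule are illustrative assumptions):

```python
import numpy as np

def knn_regress(X_train, y_train, x_query, k):
    """Standard k-NN regression: average the responses of the k
    training points nearest to x_query in Euclidean distance."""
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(dists)[:k]
    return y_train[nearest].mean()

# Toy illustration: noisy samples of f(x) = x^2 on [0, 1].
rng = np.random.default_rng(0)
n = 2000
X = rng.uniform(0.0, 1.0, size=(n, 1))
y = X[:, 0] ** 2 + rng.normal(0.0, 0.1, size=n)

# Uniform-consistency results require k to grow at least
# logarithmically in n; the constant 20 is an arbitrary choice.
k = int(20 * np.log(n))
est = knn_regress(X, y, np.array([0.5]), k)  # should be near 0.25
```

The same pointwise estimate, held uniformly over all queries, is what drives the paper's downstream applications to level-set and maximum estimation.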
Cite
Jiang. "Non-Asymptotic Uniform Rates of Consistency for k-NN Regression." AAAI Conference on Artificial Intelligence, 2019. doi:10.1609/AAAI.V33I01.33013999
BibTeX
@inproceedings{jiang2019aaai-non,
title = {{Non-Asymptotic Uniform Rates of Consistency for k-NN Regression}},
author = {Jiang, Heinrich},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2019},
pages = {3999-4006},
doi = {10.1609/AAAI.V33I01.33013999},
url = {https://mlanthology.org/aaai/2019/jiang2019aaai-non/}
}