Adaptive and Non-Adaptive Minimax Rates for Weighted Laplacian-Eigenmap Based Nonparametric Regression
Abstract
We establish both adaptive and non-adaptive minimax rates of convergence for a family of weighted Laplacian-Eigenmap based nonparametric regression methods, when the true regression function belongs to a Sobolev space and the sampling density is bounded from above and below. The adaptation methodology is based on extensions of Lepski’s method and is over both the smoothness parameter ($s\in\mathbb{N}_{+}$) and the norm parameter ($M>0$) determining the constraints on the Sobolev space. Our results extend the non-adaptive result in Green et al. (2023), established for a specific normalized graph Laplacian, to a wide class of weighted Laplacian matrices used in practice, including the unnormalized Laplacian and the random walk Laplacian.
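For readers unfamiliar with the estimator family the abstract refers to, the following is a minimal illustrative sketch (not the paper's implementation) of Laplacian-eigenmap based regression: build a neighborhood graph on the design points, form a weighted graph Laplacian, and project the responses onto the span of its first K eigenvectors. The 0/1 kernel, the radius `eps`, the truncation level `K`, and the function name are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def laplacian_eigenmap_regression(X, y, eps=0.3, K=20, laplacian="unnormalized"):
    """Regress y onto the span of the K lowest-frequency eigenvectors of a graph Laplacian.

    X: (n, d) design points, y: (n,) responses. Illustrative sketch only.
    """
    n = len(y)
    # Epsilon-neighborhood graph with 0/1 weights; eps is assumed large enough
    # that no vertex is isolated (otherwise degrees below would be zero).
    W = (cdist(X, X) <= eps).astype(float)
    np.fill_diagonal(W, 0.0)
    d = W.sum(axis=1)

    if laplacian == "unnormalized":
        # L = D - W is symmetric; eigh returns eigenvalues in ascending order.
        _, vecs = eigh(np.diag(d) - W)
        V = vecs[:, :K]
    elif laplacian == "random_walk":
        # L_rw = I - D^{-1} W shares eigenvalues with the symmetric normalized
        # Laplacian L_sym = I - D^{-1/2} W D^{-1/2}; if L_sym u = lam * u,
        # then D^{-1/2} u is an eigenvector of L_rw with the same eigenvalue.
        d_inv_sqrt = 1.0 / np.sqrt(d)
        L_sym = np.eye(n) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
        _, vecs = eigh(L_sym)
        V = d_inv_sqrt[:, None] * vecs[:, :K]
    else:
        raise ValueError(f"unknown Laplacian type: {laplacian}")

    # Least-squares projection of y onto span(V); with orthonormal V this is V V^T y.
    coef, *_ = np.linalg.lstsq(V, y, rcond=None)
    return V @ coef  # fitted values at the design points

# Example usage with synthetic data:
# X = np.random.rand(200, 2); y = np.sin(4 * X[:, 0]) + 0.1 * np.random.randn(200)
# y_hat = laplacian_eigenmap_regression(X, y, eps=0.25, K=15)
```

In this sketch the truncation level `K` plays the role of the smoothing parameter; in the paper it is instead chosen from the data, without knowledge of the smoothness $s$ or norm bound $M$, via an extension of Lepski's method.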
Cite
Text
Shi et al. "Adaptive and Non-Adaptive Minimax Rates for Weighted Laplacian-Eigenmap Based Nonparametric Regression." Artificial Intelligence and Statistics, 2024.
Markdown
[Shi et al. "Adaptive and Non-Adaptive Minimax Rates for Weighted Laplacian-Eigenmap Based Nonparametric Regression." Artificial Intelligence and Statistics, 2024.](https://mlanthology.org/aistats/2024/shi2024aistats-adaptive/)
BibTeX
@inproceedings{shi2024aistats-adaptive,
title = {{Adaptive and Non-Adaptive Minimax Rates for Weighted Laplacian-Eigenmap Based Nonparametric Regression}},
author = {Shi, Zhaoyang and Balasubramanian, Krishna and Polonik, Wolfgang},
booktitle = {Artificial Intelligence and Statistics},
year = {2024},
pages = {2800--2808},
volume = {238},
url = {https://mlanthology.org/aistats/2024/shi2024aistats-adaptive/}
}