Sample Complexity and Effective Dimension for Regression on Manifolds
Abstract
We consider the theory of regression on a manifold using reproducing kernel Hilbert space methods. Manifold models arise in a wide variety of modern machine learning problems, and our goal is to help understand the effectiveness of various implicit and explicit dimensionality-reduction methods that exploit manifold structure. Our first key contribution is to establish a novel nonasymptotic version of the Weyl law from differential geometry. From this we are able to show that certain spaces of smooth functions on a manifold are effectively finite-dimensional, with a complexity that scales according to the manifold dimension rather than any ambient data dimension. Finally, we show that given (potentially noisy) function values taken uniformly at random over a manifold, a kernel regression estimator (derived from the spectral decomposition of the manifold) yields minimax-optimal error bounds that are controlled by the effective dimension.
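For orientation, the classical Weyl law describes how the eigenvalues of the Laplace-Beltrami operator on a compact d-dimensional manifold M accumulate; the paper's first contribution is a nonasymptotic refinement of this statement. The standard asymptotic form (shown here only for context, not the paper's version) is:

% Classical Weyl law: N(lambda) counts Laplace-Beltrami eigenvalues <= lambda;
% omega_d is the volume of the unit ball in R^d, d = dim M.
N(\lambda) = \#\{\, j : \lambda_j \le \lambda \,\}
\;\sim\; \frac{\omega_d \,\mathrm{vol}(M)}{(2\pi)^d}\,\lambda^{d/2},
\qquad \lambda \to \infty .

The estimator described in the abstract is a kernel regression method whose kernel is built from the manifold's spectral decomposition. Below is a minimal numerical sketch on the circle S^1, where the Laplace-Beltrami eigenfunctions are the Fourier modes and everything is available in closed form. The eigenvalue filter mu_k = (1 + k^2)^(-s), the smoothness order s, the ridge parameter, and the target function are all illustrative assumptions, not choices taken from the paper.

import numpy as np

# Toy example: kernel ridge regression on the circle S^1, where the
# Laplace-Beltrami eigenfunctions are known in closed form (Fourier modes).
# The kernel is built from its spectral decomposition,
#   K(x, y) = sum_j mu_j * phi_j(x) * phi_j(y),
# with eigenvalues mu_j decaying polynomially (an assumed smoothness filter).

rng = np.random.default_rng(0)
n, n_modes, noise, ridge = 200, 50, 0.1, 1e-3

def features(x, n_modes):
    """Laplace-Beltrami eigenfunctions on S^1: constant, cosines, sines."""
    k = np.arange(1, n_modes + 1)
    return np.hstack([np.ones((len(x), 1)),
                      np.cos(np.outer(x, k)),
                      np.sin(np.outer(x, k))])

# Spectral filter: mode k has Laplacian eigenvalue k^2; take mu_k = (1+k^2)^(-s).
s = 2.0
k = np.arange(1, n_modes + 1)
mu = np.concatenate([[1.0], (1 + k**2) ** (-s), (1 + k**2) ** (-s)])

# Noisy samples of a smooth target, drawn uniformly at random on the circle.
x = rng.uniform(0, 2 * np.pi, n)
f_true = lambda t: np.sin(3 * t) + 0.5 * np.cos(t)
y = f_true(x) + noise * rng.standard_normal(n)

# Kernel matrix via the spectral decomposition, then kernel ridge regression.
Phi = features(x, n_modes)
K = (Phi * mu) @ Phi.T
alpha = np.linalg.solve(K + n * ridge * np.eye(n), y)

# Predict on a test grid and report the mean squared error.
x_test = np.linspace(0, 2 * np.pi, 500)
K_test = (features(x_test, n_modes) * mu) @ Phi.T
mse = np.mean((K_test @ alpha - f_true(x_test)) ** 2)
print(f"test MSE: {mse:.4f}")

This is a toy instance of kernel ridge regression with a spectrally defined kernel; on a general manifold the eigenfunctions are not available in closed form and would have to be approximated (for example from a graph Laplacian), which is outside the scope of this sketch.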
Cite
Text
McRae et al. "Sample Complexity and Effective Dimension for Regression on Manifolds." Neural Information Processing Systems, 2020.

Markdown
[McRae et al. "Sample Complexity and Effective Dimension for Regression on Manifolds." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/mcrae2020neurips-sample/)

BibTeX
@inproceedings{mcrae2020neurips-sample,
title = {{Sample Complexity and Effective Dimension for Regression on Manifolds}},
author = {McRae, Andrew and Romberg, Justin and Davenport, Mark},
booktitle = {Neural Information Processing Systems},
year = {2020},
url = {https://mlanthology.org/neurips/2020/mcrae2020neurips-sample/}
}