Optimal Rates of Sketched-Regularized Algorithms for Least-Squares Regression over Hilbert Spaces
Abstract
We investigate regularized algorithms combined with projection for the least-squares regression problem over a Hilbert space, covering nonparametric regression over a reproducing kernel Hilbert space. We prove convergence results with respect to variants of norms, under a capacity assumption on the hypothesis space and a regularity condition on the target function. As a consequence, we obtain optimal rates for regularized algorithms with randomized sketches, provided the sketch dimension is proportional to the effective dimension up to a logarithmic factor. As a byproduct, we obtain similar results for Nyström regularized algorithms. Our results provide optimal, distribution-dependent rates for sketched/Nyström regularized algorithms in both the attainable and non-attainable cases.
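As a rough illustration of the algorithm class the abstract describes, the following is a minimal sketch of kernel ridge regression with a randomized (Gaussian) sketch: the coefficient vector is restricted to the range of the sketch matrix, so one only solves an m-by-m linear system instead of an n-by-n one. All names (the RBF kernel choice, `sketched_krr`, the parameter values) are illustrative assumptions, not the paper's notation or code.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian RBF kernel matrix (illustrative choice of kernel)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def sketched_krr(X, y, lam, m, gamma=1.0, seed=None):
    """Sketched kernel ridge regression (hypothetical helper).

    Restricts the KRR coefficients to the span of S^T for a
    Gaussian sketch S of shape (m, n), and solves the resulting
    m x m sketched normal equations.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    S = rng.standard_normal((m, n)) / np.sqrt(m)
    SK = S @ K
    # minimize ||K S^T a - y||^2 + lam * n * a^T (S K S^T) a over a in R^m
    A = SK @ K @ S.T + lam * n * (SK @ S.T)
    b = SK @ y
    alpha = np.linalg.solve(A, b)
    # coefficients back in R^n: predictor is f(x) = k(x, X) @ coef
    return S.T @ alpha

# usage: fit a noisy sine curve with sketch dimension m << n
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)
coef = sketched_krr(X, y, lam=1e-3, m=50, gamma=5.0, seed=1)
mse = np.mean((rbf_kernel(X, X, gamma=5.0) @ coef - y) ** 2)
print(mse)
```

The paper's point is that, under its capacity and regularity assumptions, taking m proportional to the effective dimension (up to a log factor) suffices for the sketched estimator to attain the optimal rate; replacing the Gaussian sketch with subsampling rows gives the Nyström variant.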
Cite

Text:
Lin and Cevher. "Optimal Rates of Sketched-Regularized Algorithms for Least-Squares Regression over Hilbert Spaces." International Conference on Machine Learning, 2018.

Markdown:
[Lin and Cevher. "Optimal Rates of Sketched-Regularized Algorithms for Least-Squares Regression over Hilbert Spaces." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/lin2018icml-optimal/)

BibTeX:
@inproceedings{lin2018icml-optimal,
  title = {{Optimal Rates of Sketched-Regularized Algorithms for Least-Squares Regression over Hilbert Spaces}},
  author = {Lin, Junhong and Cevher, Volkan},
  booktitle = {International Conference on Machine Learning},
  year = {2018},
  pages = {3102--3111},
  volume = {80},
  url = {https://mlanthology.org/icml/2018/lin2018icml-optimal/}
}