Uniform Consistency of Cross-Validation Estimators for High-Dimensional Ridge Regression
Abstract
We examine generalized and leave-one-out cross-validation for ridge regression in a proportional asymptotic framework where the dimension of the feature space grows proportionally with the number of observations. Given i.i.d. samples from a linear model with an arbitrary feature covariance and a signal vector that is bounded in $\ell_2$ norm, we show that generalized cross-validation for ridge regression converges almost surely to the expected out-of-sample prediction error, uniformly over a range of ridge regularization parameters that includes zero (and even negative values). We prove the analogous result for leave-one-out cross-validation. As a consequence, we show that ridge tuning via minimization of generalized or leave-one-out cross-validation asymptotically almost surely delivers the optimal level of regularization for predictive accuracy, whether it be positive, negative, or zero.
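To make the quantities in the abstract concrete, here is a minimal NumPy sketch (not from the paper) of the GCV and LOOCV criteria for ridge regression and of tuning by grid minimization. The function names (`ridge_smoother`, `gcv`, `loocv`), the simulation setup, and the grid of regularization levels, including the small negative values admissible when $p > n$, are all illustrative assumptions.

```python
import numpy as np

def ridge_smoother(X, lam):
    # Ridge hat matrix S_lam = X (X'X + n*lam*I_p)^{-1} X', computed on the
    # n x n Gram side as G (G + n*lam*I_n)^{-1} with G = X X'.  When p > n,
    # G is nonsingular almost surely, so slightly negative lam is allowed
    # (an illustrative choice; the safe range depends on lam_min(G)/n).
    n = X.shape[0]
    G = X @ X.T
    return G @ np.linalg.inv(G + n * lam * np.eye(n))

def gcv(X, y, lam):
    # Generalized cross-validation: (1/n)||y - S y||^2 / (1 - tr(S)/n)^2.
    n = len(y)
    S = ridge_smoother(X, lam)
    resid = y - S @ y
    return (resid @ resid / n) / (1.0 - np.trace(S) / n) ** 2

def loocv(X, y, lam):
    # Leave-one-out CV via the shortcut formula (no refitting):
    # (1/n) * sum_i ((y_i - (S y)_i) / (1 - S_ii))^2.
    S = ridge_smoother(X, lam)
    resid = y - S @ y
    return np.mean((resid / (1.0 - np.diag(S))) ** 2)

rng = np.random.default_rng(0)
n, p = 200, 400                              # proportional regime, p/n = 2
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p) / np.sqrt(p)   # signal with bounded l2 norm
y = X @ beta + rng.standard_normal(n)

# The grid skips lam = 0 (the GCV denominator vanishes there when p > n)
# but includes small negative values, echoing the paper's result that the
# optimal regularization level may be positive, negative, or zero.
grid = np.concatenate([np.linspace(-0.05, -0.01, 5), np.linspace(0.01, 2.0, 50)])
lam_gcv = grid[np.argmin([gcv(X, y, l) for l in grid])]
lam_loo = grid[np.argmin([loocv(X, y, l) for l in grid])]
print(f"GCV-tuned lambda: {lam_gcv:.3f}, LOOCV-tuned lambda: {lam_loo:.3f}")
```

Both criteria avoid refitting the model $n$ times: they reuse a single smoother matrix per $\lambda$, which is what makes a fine grid search over the regularization path cheap in practice.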
Cite
Text
Patil et al. "Uniform Consistency of Cross-Validation Estimators for High-Dimensional Ridge Regression." Artificial Intelligence and Statistics, 2021.

Markdown

[Patil et al. "Uniform Consistency of Cross-Validation Estimators for High-Dimensional Ridge Regression." Artificial Intelligence and Statistics, 2021.](https://mlanthology.org/aistats/2021/patil2021aistats-uniform/)

BibTeX
@inproceedings{patil2021aistats-uniform,
title = {{Uniform Consistency of Cross-Validation Estimators for High-Dimensional Ridge Regression}},
author = {Patil, Pratik and Wei, Yuting and Rinaldo, Alessandro and Tibshirani, Ryan},
booktitle = {Artificial Intelligence and Statistics},
year = {2021},
pages = {3178--3186},
volume = {130},
url = {https://mlanthology.org/aistats/2021/patil2021aistats-uniform/}
}