Asymptotic Normality and Confidence Intervals for Prediction Risk of the Min-Norm Least Squares Estimator

Abstract

This paper quantifies the uncertainty of prediction risk for the min-norm least squares estimator in high-dimensional linear regression models. We establish the asymptotic normality of the prediction risk when both the sample size and the number of features tend to infinity. Based on the newly established central limit theorems (CLTs), we derive confidence intervals for the prediction risk under various scenarios. Our results demonstrate the sample-wise non-monotonicity of the prediction risk and confirm the “more data hurt” phenomenon. Furthermore, the width of the confidence intervals indicates that over-parameterization increases the variability of the prediction performance.
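To make the objects in the abstract concrete, here is a minimal sketch (not from the paper) of the min-norm least squares estimator, computed via the Moore–Penrose pseudoinverse, together with a Monte Carlo estimate of its prediction risk under an assumed isotropic Gaussian design with unit signal norm. The function names `min_norm_ls` and `prediction_risk` and the simulation settings are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch: min-norm least squares and its empirical prediction risk.
# Assumes an isotropic Gaussian design and ||beta|| = 1; not the paper's setup verbatim.
import numpy as np

def min_norm_ls(X, y):
    # Minimum-ell2-norm least squares solution: beta_hat = X^+ y.
    # Coincides with OLS when n >= p, and with the smallest-norm
    # interpolating solution when p > n.
    return np.linalg.pinv(X) @ y

def prediction_risk(n, p, sigma=1.0, n_rep=200, seed=None):
    # Monte Carlo estimate of E[(x_new^T beta_hat - x_new^T beta)^2].
    # With identity covariance, this equals ||beta_hat - beta||^2.
    rng = np.random.default_rng(seed)
    risks = []
    for _ in range(n_rep):
        beta = rng.standard_normal(p)
        beta /= np.linalg.norm(beta)          # unit signal strength
        X = rng.standard_normal((n, p))       # isotropic Gaussian design
        y = X @ beta + sigma * rng.standard_normal(n)
        beta_hat = min_norm_ls(X, y)
        risks.append(np.sum((beta_hat - beta) ** 2))
    return np.mean(risks), np.std(risks)

if __name__ == "__main__":
    p = 200
    for n in (100, 150, 190, 210, 300, 400):
        mean_risk, sd_risk = prediction_risk(n, p, seed=0)
        print(f"n={n:4d}, p={p}: risk ~ {mean_risk:.3f} (spread {sd_risk:.3f})")
```

In such a simulation the estimated risk typically peaks near the interpolation threshold n ≈ p, illustrating the sample-wise non-monotonicity ("more data hurt") discussed in the abstract, while the Monte Carlo spread gives a rough sense of the randomness that the paper's CLT-based confidence intervals quantify rigorously.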

Cite

Text

Li et al. "Asymptotic Normality and Confidence Intervals for Prediction Risk of the Min-Norm Least Squares Estimator." International Conference on Machine Learning, 2021.

Markdown

[Li et al. "Asymptotic Normality and Confidence Intervals for Prediction Risk of the Min-Norm Least Squares Estimator." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/li2021icml-asymptotic/)

BibTeX

@inproceedings{li2021icml-asymptotic,
  title     = {{Asymptotic Normality and Confidence Intervals for Prediction Risk of the Min-Norm Least Squares Estimator}},
  author    = {Li, Zeng and Xie, Chuanlong and Wang, Qinwen},
  booktitle = {International Conference on Machine Learning},
  year      = {2021},
  pages     = {6533--6542},
  volume    = {139},
  url       = {https://mlanthology.org/icml/2021/li2021icml-asymptotic/}
}