Tight Bounds for Minimum $\ell_1$-Norm Interpolation of Noisy Data

Abstract

We provide matching upper and lower bounds of order $\sigma^2/\log(d/n)$ for the prediction error of the minimum $\ell_1$-norm interpolator, a.k.a. basis pursuit. Our result is tight up to negligible terms when $d \gg n$, and is the first to imply asymptotic consistency of noisy minimum-norm interpolation for isotropic features and sparse ground truths. Our work complements the literature on "benign overfitting" for minimum $\ell_2$-norm interpolation, where asymptotic consistency can be achieved only when the features are effectively low-dimensional.
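The object of study, the minimum $\ell_1$-norm interpolator (basis pursuit), is the solution of $\min \|\beta\|_1$ subject to $X\beta = y$. As an illustrative sketch (not code from the paper), it can be computed by recasting the problem as a linear program over $z = [\beta; u]$ with $|\beta_i| \le u_i$; all names below are hypothetical:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(X, y):
    """Minimum l1-norm interpolator: argmin ||b||_1 s.t. X b = y,
    solved as an LP over z = [b; u] with the constraint |b_i| <= u_i."""
    n, d = X.shape
    c = np.concatenate([np.zeros(d), np.ones(d)])    # minimize sum(u)
    A_eq = np.hstack([X, np.zeros((n, d))])          # X b = y
    I = np.eye(d)
    A_ub = np.vstack([np.hstack([I, -I]),            #  b - u <= 0
                      np.hstack([-I, -I])])          # -b - u <= 0
    b_ub = np.zeros(2 * d)
    bounds = [(None, None)] * d + [(0, None)] * d    # b free, u >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y, bounds=bounds)
    return res.x[:d]

# Overparameterized regime d >> n with a sparse ground truth plus noise,
# matching the setting of the abstract (sizes here are arbitrary).
rng = np.random.default_rng(0)
n, d = 20, 200
X = rng.standard_normal((n, d))
beta_star = np.zeros(d)
beta_star[:3] = 1.0
y = X @ beta_star + 0.1 * rng.standard_normal(n)

b_hat = basis_pursuit(X, y)
print("interpolates noisy data:", np.allclose(X @ b_hat, y, atol=1e-5))
```

By construction the solution fits the noisy labels exactly; the paper's result concerns how far such an interpolant is from the sparse ground truth in prediction error.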

Cite

Text

Wang et al. "Tight Bounds for Minimum $\ell_1$-Norm Interpolation of Noisy Data." Artificial Intelligence and Statistics, 2022.

Markdown

[Wang et al. "Tight Bounds for Minimum $\ell_1$-Norm Interpolation of Noisy Data." Artificial Intelligence and Statistics, 2022.](https://mlanthology.org/aistats/2022/wang2022aistats-tight/)

BibTeX

@inproceedings{wang2022aistats-tight,
  title     = {{Tight Bounds for Minimum $\ell_1$-Norm Interpolation of Noisy Data}},
  author    = {Wang, Guillaume and Donhauser, Konstantin and Yang, Fanny},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2022},
  pages     = {10572--10602},
  volume    = {151},
  url       = {https://mlanthology.org/aistats/2022/wang2022aistats-tight/}
}