Fast Rates for Noisy Interpolation Require Rethinking the Effect of Inductive Bias
Abstract
Good generalization performance on high-dimensional data crucially hinges on a simple structure of the ground truth and a corresponding strong inductive bias of the estimator. Even though this intuition is valid for regularized models, in this paper we caution against a strong inductive bias for interpolation in the presence of noise: While a stronger inductive bias encourages a simpler structure that is more aligned with the ground truth, it also increases the detrimental effect of noise. Specifically, for both linear regression and classification with a sparse ground truth, we prove that minimum $\ell_p$-norm and maximum $\ell_p$-margin interpolators achieve fast polynomial rates close to order $1/n$ for $p > 1$ compared to a logarithmic rate for $p = 1$. Finally, we provide preliminary experimental evidence that this trade-off may also play a crucial role in understanding non-linear interpolating models used in practice.
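As a concrete illustration of the estimators the abstract refers to, below is a minimal sketch in Python of the minimum $\ell_p$-norm interpolator ($\arg\min \|w\|_p$ subject to $Xw = y$) and the maximum $\ell_p$-margin classifier (equivalently, $\arg\min \|w\|_p$ subject to $y_i \langle x_i, w\rangle \ge 1$), written as convex programs and solved with cvxpy. The data-generating setup (dimensions, sparsity level, noise) is a hypothetical toy configuration chosen for illustration, not the paper's experimental setting.

```python
import cvxpy as cp
import numpy as np

# Hypothetical toy setup: overparameterized (d > n) sparse linear model with noise.
rng = np.random.default_rng(0)
n, d, k, sigma = 50, 200, 5, 0.1
X = rng.standard_normal((n, d))
w_star = np.zeros(d)
w_star[:k] = 1.0                                  # k-sparse ground truth
y = X @ w_star + sigma * rng.standard_normal(n)   # noisy regression targets

def min_lp_norm_interpolator(X, y, p):
    """Minimum l_p-norm interpolator: argmin ||w||_p subject to X w = y."""
    w = cp.Variable(X.shape[1])
    cp.Problem(cp.Minimize(cp.norm(w, p)), [X @ w == y]).solve()
    return w.value

def max_lp_margin_classifier(X, y_sign, p):
    """Maximum l_p-margin classifier via the equivalent convex program
    argmin ||w||_p subject to y_i * <x_i, w> >= 1 for all i."""
    w = cp.Variable(X.shape[1])
    cp.Problem(cp.Minimize(cp.norm(w, p)),
               [cp.multiply(y_sign, X @ w) >= 1]).solve()
    return w.value

# Compare the strongest sparsity-inducing bias (p = 1) with weaker ones (p > 1).
for p in (1.0, 1.1, 2.0):
    w_hat = min_lp_norm_interpolator(X, y, p)
    print(f"p = {p}: ||w_hat - w*||_2 = {np.linalg.norm(w_hat - w_star):.3f}")

# Classification variant: labels with a fraction of sign flips (label noise).
y_sign = np.sign(X @ w_star)
y_sign[rng.random(n) < 0.1] *= -1.0
w_margin = max_lp_margin_classifier(X, y_sign, p=1.1)
```

Since $d > n$, the interpolation and margin constraints are feasible almost surely, so both programs are well defined; for every $p \ge 1$ the objective is convex, which is what makes these interpolators computable exactly.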
Cite

Text

Donhauser et al. "Fast Rates for Noisy Interpolation Require Rethinking the Effect of Inductive Bias." International Conference on Machine Learning, 2022.

Markdown

[Donhauser et al. "Fast Rates for Noisy Interpolation Require Rethinking the Effect of Inductive Bias." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/donhauser2022icml-fast/)

BibTeX
@inproceedings{donhauser2022icml-fast,
title = {{Fast Rates for Noisy Interpolation Require Rethinking the Effect of Inductive Bias}},
author = {Donhauser, Konstantin and Ruggeri, Nicolò and Stojanovic, Stefan and Yang, Fanny},
booktitle = {International Conference on Machine Learning},
year = {2022},
pages = {5397--5428},
volume = {162},
url = {https://mlanthology.org/icml/2022/donhauser2022icml-fast/}
}