On the Optimality of Misspecified Kernel Ridge Regression

Abstract

In the misspecified kernel ridge regression problem, researchers usually assume that the underlying true function $f_{\rho}^{\star} \in [\mathcal{H}]^{s}$, a less-smooth interpolation space of a reproducing kernel Hilbert space (RKHS) $\mathcal{H}$, for some $s\in (0,1)$. The existing minimax optimality results require $\left\Vert f_{\rho}^{\star} \right \Vert_{L^{\infty}} < \infty$, which implicitly requires $s > \alpha_{0}$, where $\alpha_{0} \in (0,1)$ is the embedding index, a constant depending on $\mathcal{H}$. Whether KRR is optimal for all $s\in (0,1)$ has been a long-standing open problem. In this paper, we show that KRR is minimax optimal for any $s\in (0,1)$ when $\mathcal{H}$ is a Sobolev RKHS.
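As background for the estimator the paper analyzes, the following is a minimal, self-contained sketch of kernel ridge regression with the Laplacian kernel, whose RKHS on $\mathbb{R}$ is a Sobolev space, i.e. the setting covered by the result above. All function names and the choice of target function are illustrative assumptions, not the paper's experiments.

```python
import numpy as np

def laplacian_kernel(X, Z):
    """Pairwise Laplacian kernel k(x, z) = exp(-|x - z|) for 1-D inputs.
    Its RKHS is (equivalent to) a first-order Sobolev space on R."""
    return np.exp(-np.abs(X[:, None] - Z[None, :]))

def krr_fit_predict(X_train, y_train, X_test, lam):
    """Closed-form KRR predictor:
    f_hat(x) = k(x, X_train) @ (K + n * lam * I)^{-1} y_train."""
    n = len(X_train)
    K = laplacian_kernel(X_train, X_train)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y_train)
    return laplacian_kernel(X_test, X_train) @ alpha

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, 200)
# A discontinuous target lies outside the RKHS (the "misspecified" case).
y = np.sign(np.sin(8.0 * X)) + 0.1 * rng.normal(size=200)

X_grid = np.linspace(0.0, 1.0, 50)
preds = krr_fit_predict(X, y, X_grid, lam=1e-3)
```

The regularization parameter `lam` would be tuned to the smoothness $s$ in the theory; here it is fixed only for illustration.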

Cite

Text

Zhang et al. "On the Optimality of Misspecified Kernel Ridge Regression." International Conference on Machine Learning, 2023.

Markdown

[Zhang et al. "On the Optimality of Misspecified Kernel Ridge Regression." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/zhang2023icml-optimality/)

BibTeX

@inproceedings{zhang2023icml-optimality,
  title     = {{On the Optimality of Misspecified Kernel Ridge Regression}},
  author    = {Zhang, Haobo and Li, Yicheng and Lu, Weihao and Lin, Qian},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {41331--41353},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/zhang2023icml-optimality/}
}