Consistency of Interpolation with Laplace Kernels Is a High-Dimensional Phenomenon

Abstract

We show that minimum-norm interpolation in the Reproducing Kernel Hilbert Space (RKHS) corresponding to the Laplace kernel is not consistent if the input dimension is constant. The lower bound holds for any choice of kernel bandwidth, even one selected based on the data. The result supports the empirical observation that minimum-norm interpolation (that is, an exact fit to the training data) in an RKHS generalizes well for some high-dimensional datasets, but not for low-dimensional ones.
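To make the object of study concrete, below is a minimal sketch of minimum-norm interpolation with a Laplace kernel, assuming NumPy. The function names laplace_kernel and min_norm_interpolant, the bandwidth value, and the synthetic data are illustrative assumptions, not from the paper. The minimum-RKHS-norm interpolant of data (X, y) has the closed form f(x) = k(x, X) K^{-1} y, where K is the kernel matrix on the training points.

import numpy as np

def laplace_kernel(X, Z, bandwidth=1.0):
    # Laplace kernel: k(x, z) = exp(-||x - z|| / bandwidth)
    dists = np.linalg.norm(X[:, None, :] - Z[None, :, :], axis=-1)
    return np.exp(-dists / bandwidth)

def min_norm_interpolant(X_train, y_train, bandwidth=1.0):
    # Minimum-RKHS-norm exact fit: f(x) = k(x, X) K^{-1} y.
    # K is positive definite when the training points are distinct.
    K = laplace_kernel(X_train, X_train, bandwidth)
    alpha = np.linalg.solve(K, y_train)
    return lambda X_new: laplace_kernel(X_new, X_train, bandwidth) @ alpha

# Illustrative setup (hypothetical, not from the paper): noisy labels in a
# fixed low dimension d = 1, the regime where inconsistency is proved.
rng = np.random.default_rng(0)
d, n = 1, 200
X = rng.uniform(-1.0, 1.0, size=(n, d))
y = np.sign(X[:, 0]) + rng.normal(scale=0.5, size=n)  # noisy target
f = min_norm_interpolant(X, y, bandwidth=0.5)
print(np.max(np.abs(f(X) - y)))  # ~0: fits the training data exactly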

Cite

Text

Rakhlin and Zhai. "Consistency of Interpolation with Laplace Kernels Is a High-Dimensional Phenomenon." Conference on Learning Theory, 2019.

Markdown

[Rakhlin and Zhai. "Consistency of Interpolation with Laplace Kernels Is a High-Dimensional Phenomenon." Conference on Learning Theory, 2019.](https://mlanthology.org/colt/2019/rakhlin2019colt-consistency/)

BibTeX

@inproceedings{rakhlin2019colt-consistency,
  title     = {{Consistency of Interpolation with Laplace Kernels Is a High-Dimensional Phenomenon}},
  author    = {Rakhlin, Alexander and Zhai, Xiyu},
  booktitle = {Conference on Learning Theory},
  year      = {2019},
  pages     = {2595--2623},
  volume    = {99},
  url       = {https://mlanthology.org/colt/2019/rakhlin2019colt-consistency/}
}