Asymptotics of Gaussian Regularized Least Squares

Abstract

We consider regularized least-squares (RLS) with a Gaussian kernel. We prove that if we let the Gaussian bandwidth σ → ∞ while letting the regularization parameter λ → 0, the RLS solution tends to a polynomial whose order is controlled by the relative rates of decay of 1/σ² and λ: if λ = σ^{-(2k+1)}, then, as σ → ∞, the RLS solution tends to the kth order polynomial with minimal empirical error. We illustrate the result with an example.
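The limiting behavior described in the abstract can be observed numerically. The sketch below (an illustration only, not the paper's code) fits Gaussian-kernel RLS with the coupling λ = σ^{-(2k+1)} for k = 1, so the predictions should approach the best linear fit as σ grows. The kernel convention k(s, t) = exp(−(s − t)²/σ²) and the scaling of λ by n in the linear system are assumptions of this sketch; conventions vary in the literature, but constant rescalings do not change the decay rate that determines k.

```python
import numpy as np

def gaussian_rls(x_train, y_train, x_eval, sigma, lam):
    """Gaussian-kernel regularized least squares (one assumed convention).

    Kernel: k(s, t) = exp(-(s - t)^2 / sigma^2).
    Solves (K + n * lam * I) c = y, then predicts f(x) = sum_j c_j k(x, x_j).
    """
    n = len(x_train)
    K = np.exp(-(x_train[:, None] - x_train[None, :]) ** 2 / sigma**2)
    c = np.linalg.solve(K + n * lam * np.eye(n), y_train)
    K_eval = np.exp(-(x_eval[:, None] - x_train[None, :]) ** 2 / sigma**2)
    return K_eval @ c

# Toy data with a clear nonlinear component.
x = np.linspace(-1.0, 1.0, 11)
y = x + 0.5 * x**2

# Best first-order (k = 1) empirical fit: the predicted sigma -> infinity
# limit when lam = sigma^{-(2k+1)} = sigma^{-3}.
lin = np.polyval(np.polyfit(x, y, 1), x)

# The deviation of the RLS predictions from the linear fit should shrink
# as sigma grows with lam = sigma^{-3}.
for sigma in (20.0, 200.0):
    pred = gaussian_rls(x, y, x, sigma, sigma**-3)
    print(sigma, np.max(np.abs(pred - lin)))
```

With λ decaying faster (say λ = σ^{-5}, i.e. k = 2), the same code would instead approach the best quadratic fit, which here interpolates the data exactly.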

Cite

Text

Lippert and Rifkin. "Asymptotics of Gaussian Regularized Least Squares." Neural Information Processing Systems, 2005.

Markdown

[Lippert and Rifkin. "Asymptotics of Gaussian Regularized Least Squares." Neural Information Processing Systems, 2005.](https://mlanthology.org/neurips/2005/lippert2005neurips-asymptotics/)

BibTeX

@inproceedings{lippert2005neurips-asymptotics,
  title     = {{Asymptotics of Gaussian Regularized Least Squares}},
  author    = {Lippert, Ross and Rifkin, Ryan},
  booktitle = {Neural Information Processing Systems},
  year      = {2005},
  pages     = {803--810},
  url       = {https://mlanthology.org/neurips/2005/lippert2005neurips-asymptotics/}
}