Sampling-Based Nyström Approximation and Kernel Quadrature

Abstract

We analyze the Nyström approximation of a positive definite kernel associated with a probability measure. We first prove an improved error bound for the conventional Nyström approximation with i.i.d. sampling and singular-value decomposition in the continuous regime; the proof techniques are borrowed from statistical learning theory. We further introduce a refined selection of subspaces in the Nyström approximation that comes with theoretical guarantees and is applicable to non-i.i.d. landmark points. Finally, we discuss the application of these methods to convex kernel quadrature and give novel theoretical guarantees as well as numerical observations.
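
For orientation, the following is a minimal sketch of the conventional Nyström approximation that the abstract's first result concerns: landmark points are drawn i.i.d. from the data, the landmark Gram matrix is truncated via its eigendecomposition (the SVD of a positive semi-definite matrix), and the full kernel matrix is approximated in the resulting subspace. The Gaussian kernel, the function names, and all parameter values here are illustrative assumptions; this is not the paper's refined subspace selection or its kernel quadrature construction.

import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-np.maximum(sq, 0.0) / (2.0 * sigma**2))

def nystrom(X, landmarks, kernel, rank):
    # Cross Gram matrix between all points and landmarks, and the landmark Gram matrix.
    K_nm = kernel(X, landmarks)           # shape (n, m)
    K_mm = kernel(landmarks, landmarks)   # shape (m, m)
    # Eigendecomposition of the PSD landmark matrix, truncated to the top `rank` directions.
    vals, vecs = np.linalg.eigh(K_mm)
    order = np.argsort(vals)[::-1][:rank]
    U = vecs[:, order]
    s = np.clip(vals[order], 1e-12, None)  # guard against tiny/negative eigenvalues
    # Rank-r Nyström approximation: K ≈ (K_nm U) S^{-1} (K_nm U)^T.
    B = K_nm @ U
    return (B / s) @ B.T

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))                     # points from the underlying measure
idx = rng.choice(len(X), size=50, replace=False)  # i.i.d.-sampled landmark points
K = gaussian_kernel(X, X)
K_hat = nystrom(X, X[idx], gaussian_kernel, rank=20)
print("relative Frobenius error:", np.linalg.norm(K - K_hat) / np.linalg.norm(K))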

Cite

Text

Hayakawa et al. "Sampling-Based Nyström Approximation and Kernel Quadrature." International Conference on Machine Learning, 2023.

Markdown

[Hayakawa et al. "Sampling-Based Nyström Approximation and Kernel Quadrature." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/hayakawa2023icml-samplingbased/)

BibTeX

@inproceedings{hayakawa2023icml-samplingbased,
  title     = {{Sampling-Based Nyström Approximation and Kernel Quadrature}},
  author    = {Hayakawa, Satoshi and Oberhauser, Harald and Lyons, Terry},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {12678--12699},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/hayakawa2023icml-samplingbased/}
}