Kernelized Diffusion Maps
Abstract
Spectral clustering and diffusion maps are celebrated dimensionality reduction algorithms built on eigen-elements related to the diffusive structure of the data. The core of these procedures is the approximation of a Laplacian through a graph-kernel approach; however, this local-averaging construction is known to suffer from the curse of dimensionality in high dimension $d$. In this article, we build a different estimator of the Laplacian, via a reproducing kernel Hilbert space method, which adapts naturally to the regularity of the problem. We provide non-asymptotic statistical rates proving that the kernel estimator we build can circumvent the curse of dimensionality. Finally, we discuss techniques (Nyström subsampling, Fourier features) that reduce the computational cost of the estimator without degrading its overall performance.
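For context, the graph-kernel construction that the abstract refers to as the baseline can be sketched as follows. This is a minimal illustrative implementation of classical diffusion maps (not the paper's RKHS estimator); the function name `diffusion_map` and the bandwidth parameter `epsilon` are illustrative choices, not from the paper.

```python
import numpy as np

def diffusion_map(X, epsilon=1.0, n_components=2, t=1):
    """Classical diffusion-map embedding from a Gaussian graph kernel."""
    # Pairwise squared distances and Gaussian kernel matrix.
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / epsilon)
    # Row-normalize to obtain the diffusion (Markov) matrix P = D^{-1} K.
    d = K.sum(axis=1)
    # Symmetric conjugate S = D^{1/2} P D^{-1/2} for a stable eigensolve.
    S = K / np.sqrt(np.outer(d, d))
    S = (S + S.T) / 2
    vals, vecs = np.linalg.eigh(S)
    # Keep the largest eigenvalues, skipping the trivial top one (lambda = 1).
    idx = np.argsort(vals)[::-1][1:n_components + 1]
    # Right eigenvectors of P are D^{-1/2} times eigenvectors of S.
    psi = vecs[:, idx] / np.sqrt(d)[:, None]
    # Diffusion coordinates at diffusion time t.
    return (vals[idx] ** t) * psi

# Usage: embed noisy points on a circle into 2 diffusion coordinates.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 200)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(200, 2))
Y = diffusion_map(X, epsilon=0.5)
```

The local averaging performed by the kernel matrix `K` is exactly the step whose accuracy degrades with the ambient dimension $d$, which motivates the RKHS-based Laplacian estimator studied in the paper.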
Cite
Pillaud-Vivien and Bach. "Kernelized Diffusion Maps." Conference on Learning Theory, 2023.
@inproceedings{pillaudvivien2023colt-kernelized,
title = {{Kernelized Diffusion Maps}},
author = {Pillaud-Vivien, Loucas and Bach, Francis},
booktitle = {Conference on Learning Theory},
year = {2023},
pages = {5236--5259},
volume = {195},
url = {https://mlanthology.org/colt/2023/pillaudvivien2023colt-kernelized/}
}