Error Analysis of Laplacian Eigenmaps for Semi-Supervised Learning
Abstract
We study the error and sample complexity of semi-supervised learning by Laplacian Eigenmaps in the limit of infinite unlabeled data. We provide a bound on the error and show that it is controlled by the graph Laplacian regularizer. Our analysis also gives guidance on the choice of the number of eigenvectors $k$ to use: when the data lies on a $d$-dimensional domain, the optimal choice of $k$ is of order $(n/\log(n))^{\frac{d}{d+2}}$, yielding an asymptotic error rate of $(n/\log(n))^{-\frac{2}{2+d}}$.
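The procedure the abstract analyzes can be summarized as: build a similarity graph over all labeled and unlabeled points, take the first $k$ eigenvectors of the graph Laplacian, and regress the observed labels onto those eigenvectors. The sketch below is a minimal Python illustration under assumed choices (a Gaussian-kernel graph, an unnormalized Laplacian, and an arbitrary bandwidth); the function name and parameters are illustrative and not taken from the paper.

```python
import numpy as np

def laplacian_eigenmaps_ssl(X, y_labeled, labeled_idx, k=10, sigma=1.0):
    """Semi-supervised regression on the first k Laplacian eigenvectors (sketch).

    X           : (n, d) array of all points, labeled and unlabeled.
    y_labeled   : (m,) targets for the labeled points.
    labeled_idx : indices into X of the labeled points.
    k           : number of eigenvectors to regress on.
    sigma       : Gaussian-kernel bandwidth (illustrative choice, not from the paper).
    """
    # Gaussian-kernel weights W_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq_dists / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)

    # Unnormalized graph Laplacian L = D - W
    L = np.diag(W.sum(axis=1)) - W

    # Eigenvectors with the k smallest eigenvalues (eigh sorts them ascending)
    _, vecs = np.linalg.eigh(L)
    U = vecs[:, :k]                                   # (n, k) eigenmap coordinates

    # Least-squares fit of the labels in the span of the k eigenvectors
    coef, *_ = np.linalg.lstsq(U[labeled_idx], y_labeled, rcond=None)

    # Predictions for every point, labeled and unlabeled
    return U @ coef

# Toy usage: points on a circle (a 1-dimensional manifold), few of them labeled
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 2 * np.pi, 200))
X = np.column_stack([np.cos(t), np.sin(t)])
labeled_idx = rng.choice(200, size=20, replace=False)
y_labeled = np.sin(2 * t[labeled_idx])                # smooth target on the manifold
y_hat = laplacian_eigenmaps_ssl(X, y_labeled, labeled_idx, k=8, sigma=0.3)
```

In this setting $d = 1$, so the analysis suggests scaling $k$ roughly like $(n/\log(n))^{1/3}$ as the number of points $n$ grows; the fixed $k=8$ above is only for illustration.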
Cite
Text
Zhou and Srebro. "Error Analysis of Laplacian Eigenmaps for Semi-Supervised Learning." Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, 2011.
Markdown
[Zhou and Srebro. "Error Analysis of Laplacian Eigenmaps for Semi-Supervised Learning." Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, 2011.](https://mlanthology.org/aistats/2011/zhou2011aistats-error/)
BibTeX
@inproceedings{zhou2011aistats-error,
title = {{Error Analysis of Laplacian Eigenmaps for Semi-Supervised Learning}},
author = {Zhou, Xueyuan and Srebro, Nathan},
booktitle = {Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics},
year = {2011},
pages = {901--908},
volume = {15},
url = {https://mlanthology.org/aistats/2011/zhou2011aistats-error/}
}