Spectral Algorithms for Learning and Clustering
Abstract
Roughly speaking, spectral algorithms are methods that rely on the principal components (typically singular values and singular vectors) of an input matrix (or graph). The spectrum of a matrix captures many interesting properties in surprising ways. Spectral methods are already used in unsupervised learning, in image segmentation, to improve precision and recall in databases, and broadly in information retrieval. The common component of these methods is the subspace spanned by a small number of singular vectors of the data, obtained via the Singular Value Decomposition (SVD). We describe SVD from a geometric perspective and then focus on its central role in efficient algorithms for (a) the classical problem of “learning” a mixture of Gaussians in R^n and (b) clustering a set of objects from pairwise similarities.
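To make the common component concrete, the following toy sketch (an illustration assumed here, not code from the paper) projects points drawn from a mixture of two spherical Gaussians in R^n onto the subspace spanned by the top-k right singular vectors of the data matrix; the component means remain well separated after the projection, which is the property the mixture-learning algorithms exploit.

```python
import numpy as np

rng = np.random.default_rng(0)
n, dim, k = 200, 50, 2  # points per component, ambient dimension, SVD rank

# Two spherical Gaussian components in R^dim with well-separated means.
mean_a = np.zeros(dim)
mean_b = np.zeros(dim)
mean_b[0] = 10.0
points = np.vstack([
    rng.normal(mean_a, 1.0, size=(n, dim)),
    rng.normal(mean_b, 1.0, size=(n, dim)),
])

# SVD of the data matrix: rows of Vt are the right singular vectors,
# ordered by decreasing singular value.
_, _, Vt = np.linalg.svd(points, full_matrices=False)

# Project every point onto the span of the top-k singular vectors.
projected = points @ Vt[:k].T  # each point now lives in R^k

# The inter-mean distance survives the projection almost intact.
gap = float(np.linalg.norm(projected[:n].mean(0) - projected[n:].mean(0)))
print(gap)
```

Since the top singular directions of the data matrix are pulled toward the span of the component means, the printed gap stays close to the original inter-mean distance of 10, even though the dimension dropped from 50 to 2.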
Cite
Vempala. "Spectral Algorithms for Learning and Clustering." Annual Conference on Computational Learning Theory, 2007. doi:10.1007/978-3-540-72927-3_2
BibTeX
@inproceedings{vempala2007colt-spectral,
title = {{Spectral Algorithms for Learning and Clustering}},
author = {Vempala, Santosh S.},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2007},
pages = {3-4},
doi = {10.1007/978-3-540-72927-3_2},
url = {https://mlanthology.org/colt/2007/vempala2007colt-spectral/}
}