Strong Consistency, Graph Laplacians, and the Stochastic Block Model

Abstract

Spectral clustering has become one of the most popular algorithms in data clustering and community detection. We study the performance of classical two-step spectral clustering via the graph Laplacian to learn the stochastic block model. Our aim is to answer the following question: when is spectral clustering via the graph Laplacian able to achieve strong consistency, i.e., the exact recovery of the underlying hidden communities? Our work provides an entrywise analysis (an $\ell_{\infty}$-norm perturbation bound) of the Fiedler eigenvector of both the unnormalized and the normalized Laplacian associated with the adjacency matrix sampled from the stochastic block model. We prove that spectral clustering is able to achieve exact recovery of the planted community structure under conditions that match the information-theoretic limits.
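For illustration only, here is a minimal numpy sketch of the two-step procedure the abstract refers to: sample a symmetric two-block SBM, form the unnormalized Laplacian L = D - A, and read off the communities from the sign pattern of the Fiedler eigenvector. This is not the authors' code, and the community size and edge probabilities (n, p, q) below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Step 0: sample an adjacency matrix from a symmetric two-block SBM
# (parameter values are illustrative, not from the paper).
n = 200                        # nodes per community
p, q = 0.30, 0.05              # within- / between-community edge probabilities
labels = np.repeat([0, 1], n)  # planted community assignment
P = np.where(labels[:, None] == labels[None, :], p, q)
upper = np.triu(rng.random((2 * n, 2 * n)) < P, k=1)
A = (upper | upper.T).astype(float)   # symmetric, no self-loops

# Step 1: form the unnormalized Laplacian and extract the Fiedler eigenvector
# (the eigenvector of the second-smallest eigenvalue).
D = np.diag(A.sum(axis=1))
L = D - A
_, eigvecs = np.linalg.eigh(L)        # eigenvalues returned in ascending order
fiedler = eigvecs[:, 1]

# Step 2: cluster the vertices by the sign of the Fiedler eigenvector.
estimate = (fiedler > 0).astype(int)

# Exact recovery means agreement with the planted labels up to a global swap.
agreement = max(np.mean(estimate == labels), np.mean(estimate != labels))
print(f"fraction of correctly labeled nodes: {agreement:.3f}")

In this sparsity regime the sign of the Fiedler eigenvector typically recovers both blocks exactly; the normalized-Laplacian variant replaces L with D^{-1/2} L D^{-1/2} before the eigendecomposition.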

Cite

Text

Deng et al. "Strong Consistency, Graph Laplacians, and the Stochastic Block Model." Journal of Machine Learning Research, 2021.

Markdown

[Deng et al. "Strong Consistency, Graph Laplacians, and the Stochastic Block Model." Journal of Machine Learning Research, 2021.](https://mlanthology.org/jmlr/2021/deng2021jmlr-strong/)

BibTeX

@article{deng2021jmlr-strong,
  title     = {{Strong Consistency, Graph Laplacians, and the Stochastic Block Model}},
  author    = {Deng, Shaofeng and Ling, Shuyang and Strohmer, Thomas},
  journal   = {Journal of Machine Learning Research},
  year      = {2021},
  pages     = {1--44},
  volume    = {22},
  url       = {https://mlanthology.org/jmlr/2021/deng2021jmlr-strong/}
}