Convergence and Recovery Guarantees of the K-Subspaces Method for Subspace Clustering
Abstract
The K-subspaces (KSS) method is a generalization of the K-means method for subspace clustering. In this work, we present a local convergence analysis and a recovery guarantee for KSS, assuming data are generated by the semi-random union of subspaces model, in which $N$ points are randomly sampled from $K \ge 2$ overlapping subspaces. We show that if the initial assignment of the KSS method lies within a neighborhood of a true clustering, it converges at a superlinear rate and finds the correct clustering within $\Theta(\log\log N)$ iterations with high probability. Moreover, we propose a thresholding inner-product based spectral method for initialization and prove that it produces a point in this neighborhood. We also present numerical results to support our theoretical developments.
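The KSS iteration alternates between assigning each point to its nearest subspace and re-fitting each subspace to its assigned points. Below is a minimal sketch of this alternating scheme in NumPy; the random-basis initialization, dimensions, and function name are illustrative placeholders (the paper's own initialization is the thresholding inner-product spectral method), not the authors' implementation.

```python
import numpy as np

def k_subspaces(X, K, d, n_iters=50, seed=0):
    """Sketch of the K-subspaces (KSS) alternating scheme.

    X : (D, N) data matrix; fit K subspaces of dimension d.
    Alternates between (1) assigning each point to the subspace with
    the smallest projection residual and (2) re-fitting each subspace
    basis from the top-d principal directions of its cluster.
    """
    rng = np.random.default_rng(seed)
    D, N = X.shape
    # Random orthonormal bases stand in for a proper initialization.
    U = [np.linalg.qr(rng.standard_normal((D, d)))[0] for _ in range(K)]
    labels = np.zeros(N, dtype=int)
    for _ in range(n_iters):
        # Assignment step: the residual ||x - U U^T x|| is minimized
        # exactly when the projection norm ||U^T x|| is maximized.
        proj = np.stack([np.linalg.norm(Uk.T @ X, axis=0) for Uk in U])
        new_labels = np.argmax(proj, axis=0)
        if np.array_equal(new_labels, labels):
            break  # assignments stabilized
        labels = new_labels
        # Update step: top-d left singular vectors of each cluster.
        for k in range(K):
            Xk = X[:, labels == k]
            if Xk.shape[1] >= d:
                U[k] = np.linalg.svd(Xk, full_matrices=False)[0][:, :d]
    return labels, U
```

With a good initialization, the paper shows this loop converges superlinearly, which is why only $\Theta(\log\log N)$ iterations are needed; with the random bases used above, convergence to the correct clustering is not guaranteed.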
Cite
Wang et al. "Convergence and Recovery Guarantees of the K-Subspaces Method for Subspace Clustering." International Conference on Machine Learning, 2022.
@inproceedings{wang2022icml-convergence,
title = {{Convergence and Recovery Guarantees of the K-Subspaces Method for Subspace Clustering}},
author = {Wang, Peng and Liu, Huikang and So, Anthony Man-Cho and Balzano, Laura},
booktitle = {International Conference on Machine Learning},
year = {2022},
pages = {22884--22918},
volume = {162},
url = {https://mlanthology.org/icml/2022/wang2022icml-convergence/}
}