A Constant-Factor Bi-Criteria Approximation Guarantee for K-Means++
Abstract
This paper studies the $k$-means++ algorithm for clustering as well as the class of $D^\ell$ sampling algorithms to which $k$-means++ belongs. It is shown that for any constant factor $\beta > 1$, selecting $\beta k$ cluster centers by $D^\ell$ sampling yields a constant-factor approximation to the optimal clustering with $k$ centers, in expectation and without conditions on the dataset. This result extends the previously known $O(\log k)$ guarantee for the case $\beta = 1$ to the constant-factor bi-criteria regime. It also improves upon an existing constant-factor bi-criteria result that holds only with constant probability.
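The seeding procedure the abstract refers to can be sketched in code. The following is a minimal, illustrative implementation of $D^\ell$ sampling on one-dimensional data (the function name, signature, and the 1-D setting are assumptions for illustration, not the paper's notation): the first center is drawn uniformly, and each subsequent center is drawn with probability proportional to its distance to the nearest chosen center raised to the power $\ell$; $\ell = 2$ recovers $k$-means++ seeding, and passing $\lceil \beta k \rceil$ as the number of centers gives the bi-criteria variant analyzed in the paper.

```python
import random

def d_ell_sample_centers(points, num_centers, ell=2, rng=None):
    """Illustrative sketch of D^ell sampling seeding (ell = 2 is k-means++).

    points: list of 1-D floats (for simplicity of the sketch).
    num_centers: number of centers to select; use ceil(beta * k) for the
    bi-criteria regime studied in the paper.
    """
    rng = rng or random.Random()
    centers = [rng.choice(points)]
    # dists[i] = (distance from points[i] to its nearest chosen center) ** ell
    dists = [abs(p - centers[0]) ** ell for p in points]
    while len(centers) < num_centers:
        total = sum(dists)
        if total == 0:  # every point coincides with some center
            centers.append(rng.choice(points))
            continue
        # Draw a point with probability proportional to dists[i]
        r = rng.random() * total
        acc = 0.0
        for p, d in zip(points, dists):
            acc += d
            if acc > r:
                centers.append(p)
                break
        # Update nearest-center distances against the new center
        dists = [min(d, abs(p - centers[-1]) ** ell)
                 for p, d in zip(points, dists)]
    return centers
```

Each selected center is always one of the input points, so the procedure is a seeding step; a full $k$-means run would follow it with Lloyd iterations.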
Cite
Wei. "A Constant-Factor Bi-Criteria Approximation Guarantee for K-Means++." Neural Information Processing Systems, 2016.
BibTeX
@inproceedings{wei2016neurips-constantfactor,
title = {{A Constant-Factor Bi-Criteria Approximation Guarantee for K-Means++}},
author = {Wei, Dennis},
booktitle = {Neural Information Processing Systems},
year = {2016},
pages = {604--612},
url = {https://mlanthology.org/neurips/2016/wei2016neurips-constantfactor/}
}