K-Means++: Few More Steps Yield Constant Approximation
Abstract
The k-means++ algorithm of Arthur and Vassilvitskii (SODA 2007) is a state-of-the-art algorithm for solving the k-means clustering problem and is known to give an O(log k) approximation. Recently, Lattanzi and Sohler (ICML 2019) proposed augmenting k-means++ with O(k log log k) local search steps to yield a constant approximation (in expectation) to the k-means clustering problem. In this paper, we improve their analysis to show that, for any arbitrarily small constant ε > 0, with only εk additional local search steps, one can achieve a constant approximation guarantee (with high probability in k), resolving an open problem in their paper.
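The scheme the abstract refers to can be sketched roughly as follows: seed k centers by D²-sampling (standard k-means++), then perform local search steps in which a candidate center is D²-sampled and swapped against an existing center whenever the swap lowers the k-means cost. The sketch below is a hedged toy illustration on 1-D points, not the authors' implementation; all function names and the 1-D setting are assumptions made for brevity.

```python
import random

def d2_cost(points, centers):
    # k-means cost: sum of squared distances to the nearest center.
    return sum(min((p - c) ** 2 for c in centers) for p in points)

def kmeanspp_seed(points, k, rng):
    # Standard k-means++ (D^2) seeding: each new center is sampled with
    # probability proportional to its squared distance to current centers.
    centers = [rng.choice(points)]
    while len(centers) < k:
        weights = [min((p - c) ** 2 for c in centers) for p in points]
        centers.append(rng.choices(points, weights=weights, k=1)[0])
    return centers

def local_search_step(points, centers, rng):
    # One local search step in the spirit of Lattanzi-Sohler:
    # D^2-sample a candidate point, then keep the best swap with an
    # existing center if it strictly lowers the cost.
    weights = [min((p - c) ** 2 for c in centers) for p in points]
    candidate = rng.choices(points, weights=weights, k=1)[0]
    best_cost, best_centers = d2_cost(points, centers), centers
    for i in range(len(centers)):
        trial = centers[:i] + centers[i + 1:] + [candidate]
        cost = d2_cost(points, trial)
        if cost < best_cost:
            best_cost, best_centers = cost, trial
    return best_centers
```

The paper's result says that, on top of the seeding, only εk such steps (for any constant ε > 0) already suffice for a constant-factor guarantee with high probability, rather than the O(k log log k) steps in the original analysis.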
Cite
Text
Choo et al. "K-Means++: Few More Steps Yield Constant Approximation." International Conference on Machine Learning, 2020.
Markdown
[Choo et al. "K-Means++: Few More Steps Yield Constant Approximation." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/choo2020icml-kmeans/)
BibTeX
@inproceedings{choo2020icml-kmeans,
title = {{K-Means++: Few More Steps Yield Constant Approximation}},
author = {Choo, Davin and Grunau, Christoph and Portmann, Julian and Rozhon, Vaclav},
booktitle = {International Conference on Machine Learning},
year = {2020},
pages = {1909-1917},
volume = {119},
url = {https://mlanthology.org/icml/2020/choo2020icml-kmeans/}
}