Locally Private K-Means in One Round
Abstract
We provide an approximation algorithm for k-means clustering in the \emph{one-round} (a.k.a. \emph{non-interactive}) local model of differential privacy (DP). Our algorithm achieves an approximation ratio arbitrarily close to that of the best \emph{non-private} approximation algorithm, improving upon previously known algorithms that only guarantee large (constant) approximation ratios. Furthermore, ours is the first constant-factor approximation algorithm for k-means that requires only \emph{one} round of communication in the local DP model, positively resolving an open question of Stemmer (SODA 2020). Our algorithmic framework is quite flexible; we demonstrate this by showing that it also yields a similar near-optimal approximation algorithm in the (one-round) shuffle DP model.
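To make the communication pattern concrete, below is a minimal Python sketch of the one-round (non-interactive) local-DP setup the abstract describes: each user applies a local randomizer to their own point exactly once, and the server only post-processes the noisy reports. This is a toy illustration using the standard Laplace mechanism, not the paper's algorithm; the noise calibration, the helper names, and the use of plain k-means on the server are all assumptions for illustration, and this naive approach does not attain the paper's near-optimal approximation ratio.

import numpy as np
from sklearn.cluster import KMeans

def randomize_point(x, epsilon, radius):
    # Toy local randomizer (assumption, not the paper's mechanism):
    # each coordinate of x lies in [-radius, radius], so the l1
    # sensitivity of a single user's report is 2 * radius * d.
    d = x.shape[0]
    sensitivity = 2.0 * radius * d
    return x + np.random.laplace(0.0, sensitivity / epsilon, size=d)

def one_round_private_kmeans(points, k, epsilon, radius):
    # Round 1 (the only round): every user sends a single noisy report;
    # the server never sees raw data.
    reports = np.array([randomize_point(x, epsilon, radius) for x in points])
    # Post-processing is free under DP: cluster the noisy reports.
    return KMeans(n_clusters=k, n_init=10).fit(reports).cluster_centers_

Because differential privacy is closed under post-processing, all of the cluster-recovery work happens on the server after the single round of reports; the difficulty the paper addresses is doing this while matching, up to an arbitrarily small loss, the approximation ratio of the best non-private algorithm.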
Cite
Text
Chang et al. "Locally Private K-Means in One Round." International Conference on Machine Learning, 2021.
Markdown
[Chang et al. "Locally Private K-Means in One Round." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/chang2021icml-locally/)
BibTeX
@inproceedings{chang2021icml-locally,
title = {{Locally Private K-Means in One Round}},
author = {Chang, Alisa and Ghazi, Badih and Kumar, Ravi and Manurangsi, Pasin},
booktitle = {International Conference on Machine Learning},
year = {2021},
pages = {1441--1451},
volume = {139},
url = {https://mlanthology.org/icml/2021/chang2021icml-locally/}
}