Differentially Private K-Means with Constant Multiplicative Error

Abstract

We design new differentially private algorithms for the Euclidean k-means problem, both in the centralized model and in the local model of differential privacy. In both models, our algorithms achieve significantly better error guarantees than the previous state of the art. In addition, in the local model, our algorithm significantly reduces the number of interaction rounds. Although the problem has been widely studied in the context of differential privacy, all of the existing constructions achieve only super-constant approximation factors. We present, for the first time, efficient private algorithms for the problem with constant multiplicative error. Furthermore, we show how to modify our algorithms so they compute private coresets for k-means clustering in both models.
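For background on what a private k-means iteration looks like, the sketch below shows a single Lloyd step made differentially private by adding Laplace noise to each cluster's point count and coordinate sums. This is a standard noisy baseline (DPLloyd-style), not the paper's constant-error algorithm; the function name and budget split are illustrative assumptions, and points are assumed to lie in the unit cube so the sensitivities are bounded.

```python
import numpy as np

def dp_lloyd_step(points, centers, epsilon, rng):
    """One Lloyd iteration with Laplace noise (DPLloyd-style baseline sketch).

    Assumes points lie in [0, 1]^d, so each point changes a cluster's count
    by at most 1 and its coordinate-sum vector by at most d in L1 norm.
    NOT the paper's algorithm; shown only as a common baseline.
    """
    k, d = centers.shape
    # Assign each point to its nearest current center.
    dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    labels = np.argmin(dists, axis=1)
    new_centers = centers.copy()
    # Split the per-step privacy budget between noisy counts and noisy sums.
    eps_count, eps_sum = epsilon / 2.0, epsilon / 2.0
    for j in range(k):
        cluster = points[labels == j]
        # Laplace noise calibrated to sensitivity 1 (count) and d (L1 sum).
        noisy_count = len(cluster) + rng.laplace(scale=1.0 / eps_count)
        noisy_sum = cluster.sum(axis=0) + rng.laplace(scale=d / eps_sum, size=d)
        if noisy_count >= 1.0:
            # Clip back into the domain to keep centers valid.
            new_centers[j] = np.clip(noisy_sum / noisy_count, 0.0, 1.0)
    return new_centers
```

Iterating this step consumes privacy budget linearly in the number of rounds (by basic composition), and its error grows with k and the dimension, which is one reason such baselines only achieve super-constant approximation factors.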

Cite

Text

Stemmer and Kaplan. "Differentially Private K-Means with Constant Multiplicative Error." Neural Information Processing Systems, 2018.

Markdown

[Stemmer and Kaplan. "Differentially Private K-Means with Constant Multiplicative Error." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/stemmer2018neurips-differentially/)

BibTeX

@inproceedings{stemmer2018neurips-differentially,
  title     = {{Differentially Private K-Means with Constant Multiplicative Error}},
  author    = {Stemmer, Uri and Kaplan, Haim},
  booktitle = {Neural Information Processing Systems},
  year      = {2018},
  pages     = {5431--5441},
  url       = {https://mlanthology.org/neurips/2018/stemmer2018neurips-differentially/}
}