K-Median Clustering via Metric Embedding: Towards Better Initialization with Differential Privacy

Abstract

In clustering algorithms, the choice of initial centers is crucial for the quality of the learned clusters. We propose a new initialization scheme for the $k$-median problem in general metric spaces (e.g., discrete spaces induced by graphs), based on constructing a metric embedding tree (HST) structure of the data, together with a novel and efficient search algorithm that finds good initial centers for the subsequent local search. This HST initialization produces initial centers with lower error than the popular $k$-median++ method, and with higher efficiency when $k$ is not too small. HST initialization also extends easily to the differential privacy (DP) setting to generate private initial centers. We show that the error of DP local search seeded with our private HST initialization improves on previous approximation-error results and approaches the lower bound within a small factor. Experiments demonstrate the effectiveness of our proposed methods.
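
For readers unfamiliar with the pipeline the abstract describes, below is a minimal Python sketch of the second stage, the single-swap local search for $k$-median. The paper's HST-based seeding is not reproduced here; a simple $k$-median++-style distance-weighted sampler stands in as the initialization step, and all function names are illustrative rather than taken from the paper.

    import random

    def kmedian_cost(points, centers, dist):
        # Total k-median cost: sum of distances from each point to its nearest center.
        return sum(min(dist(p, c) for c in centers) for p in points)

    def plusplus_init(points, k, dist, rng=random):
        # k-median++-style seeding: each new center is sampled with probability
        # proportional to its distance from the current centers.
        # (Stand-in for the paper's HST initialization, which is not shown here.)
        centers = [rng.choice(points)]
        for _ in range(k - 1):
            d = [min(dist(p, c) for c in centers) for p in points]
            r, acc = rng.uniform(0, sum(d)), 0.0
            for p, w in zip(points, d):
                acc += w
                if acc >= r:
                    centers.append(p)
                    break
        return centers

    def local_search(points, centers, dist, max_iter=100):
        # Single-swap local search: replace one center by one non-center point
        # whenever the swap strictly lowers the k-median cost.
        centers = list(centers)
        best = kmedian_cost(points, centers, dist)
        for _ in range(max_iter):
            improved = False
            for i in range(len(centers)):
                for p in points:
                    if p in centers:
                        continue
                    trial = centers[:i] + [p] + centers[i + 1:]
                    cost = kmedian_cost(points, trial, dist)
                    if cost < best:
                        centers, best, improved = trial, cost, True
                        break
                if improved:
                    break
            if not improved:
                break
        return centers, best

    # Example usage on 2-D points with Euclidean distance:
    # pts = [(0, 0), (1, 1), (5, 5), (6, 5), (10, 0)]
    # d = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    # centers, cost = local_search(pts, plusplus_init(pts, k=2, dist=d), dist=d)

The paper's contribution is to replace the seeding step with an HST-based search (and a DP variant of it) so that the local search starts from lower-cost centers; the local search loop itself is standard.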

Cite

Text

Fan et al. "K-Median Clustering via Metric Embedding: Towards Better Initialization with Differential Privacy." Neural Information Processing Systems, 2023.

Markdown

[Fan et al. "K-Median Clustering via Metric Embedding: Towards Better Initialization with Differential Privacy." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/fan2023neurips-kmedian/)

BibTeX

@inproceedings{fan2023neurips-kmedian,
  title     = {{K-Median Clustering via Metric Embedding: Towards Better Initialization with Differential Privacy}},
  author    = {Fan, Chenglin and Li, Ping and Li, Xiaoyun},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/fan2023neurips-kmedian/}
}