Faster Approximation Algorithms for K-Center via Data Reduction

Abstract

We study efficient algorithms for the Euclidean $k$-Center problem, focusing on the regime of large $k$. We take a data-reduction approach via the notion of an $\alpha$-coreset: a small subset $S$ of the dataset $P$ such that any $\beta$-approximation on $S$ is an $(\alpha + \beta)$-approximation on $P$. We give efficient algorithms that construct coresets of size $k \cdot o(n)$, which immediately speed up existing approximation algorithms. Notably, we obtain a near-linear-time $O(1)$-approximation when $k = n^c$ for any $0 < c < 1$. We validate the performance of our coresets on real-world datasets with large $k$ and observe that the coreset speeds up the well-known Gonzalez algorithm by up to a factor of $4$ while achieving a comparable clustering cost. Technically, one of our coreset results rests on a new efficient construction of consistent hashing with competitive parameters. This general tool may be of independent interest for algorithm design in high-dimensional Euclidean spaces.
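As a concrete illustration of the reduce-then-cluster pipeline the abstract describes, here is a minimal Python sketch that pairs a deliberately simple grid-based data reduction with the Gonzalez greedy algorithm. The grid rule, the `cell_width` parameter, and the function names are illustrative assumptions, not the paper's method; the paper instead builds its coresets via a new consistent-hashing construction with better guarantees.

```python
import numpy as np

def gonzalez(P, k, seed=0):
    """Greedy farthest-point traversal (Gonzalez, 1985): a 2-approximation
    for Euclidean k-center using O(|P| * k) distance evaluations."""
    rng = np.random.default_rng(seed)
    centers = [P[rng.integers(len(P))]]           # arbitrary first center
    dist = np.linalg.norm(P - centers[0], axis=1)
    for _ in range(k - 1):
        far = int(np.argmax(dist))                # farthest point joins the centers
        centers.append(P[far])
        dist = np.minimum(dist, np.linalg.norm(P - P[far], axis=1))
    return np.array(centers), float(dist.max())   # centers, clustering radius

def grid_coreset(P, cell_width):
    """Toy data reduction (NOT the paper's construction): keep one
    representative point per occupied grid cell. Every point lies within
    cell_width * sqrt(d) of its representative, so solutions computed on
    the subset transfer to P with bounded extra cost."""
    cells = np.floor(P / cell_width).astype(np.int64)
    _, keep = np.unique(cells, axis=0, return_index=True)
    return P[np.sort(keep)]

# Pipeline: reduce first, then cluster only the coreset.
P = np.random.default_rng(1).normal(size=(200_000, 2))
S = grid_coreset(P, cell_width=0.05)   # |S| << |P| when points share cells
centers, radius = gonzalez(S, k=100)
print(f"reduced {len(P)} -> {len(S)} points; radius on coreset = {radius:.3f}")
```

By the coreset guarantee stated above, running a $\beta$-approximation such as Gonzalez ($\beta = 2$) on an $\alpha$-coreset yields an $(\alpha + \beta)$-approximation on the full dataset, which is why the expensive clustering step only needs to touch the small subset.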

Cite

Text

Filtser et al. "Faster Approximation Algorithms for K-Center via Data Reduction." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Filtser et al. "Faster Approximation Algorithms for K-Center via Data Reduction." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/filtser2025icml-faster/)

BibTeX

@inproceedings{filtser2025icml-faster,
  title     = {{Faster Approximation Algorithms for K-Center via Data Reduction}},
  author    = {Filtser, Arnold and Jiang, Shaofeng H.-C. and Li, Yi and Naredla, Anurag Murty and Psarros, Ioannis and Yang, Qiaoyuan and Zhang, Qin},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {17189--17202},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/filtser2025icml-faster/}
}