Heterogeneity for the Win: One-Shot Federated Clustering
Abstract
In this work, we explore the unique challenges—and opportunities—of unsupervised federated learning (FL). We develop and analyze a one-shot federated clustering scheme, kfed, based on the widely-used Lloyd’s method for $k$-means clustering. In contrast to many supervised problems, we show that the issue of statistical heterogeneity in federated networks can in fact benefit our analysis. We analyze kfed under a center separation assumption and compare it to the best known requirements of its centralized counterpart. Our analysis shows that in heterogeneous regimes where the number of clusters per device, $k'$, is smaller than the total number of clusters over the network, $k$ (in particular, $k' \le \sqrt{k}$), we can use heterogeneity to our advantage—significantly weakening the cluster separation requirements for kfed. From a practical viewpoint, kfed also has many desirable properties: it requires only one round of communication, can run asynchronously, and can handle partial participation or node/network failures. We motivate our analysis with experiments on common FL benchmarks, and highlight the practical utility of one-shot clustering through use-cases in personalized FL and device sampling.
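The one-shot scheme described in the abstract can be sketched as follows: each device solves a local $k'$-means problem with Lloyd's method and sends only its $k'$ centers to the server, which then clusters the pooled centers into $k$ global centers. This is a minimal illustrative sketch, not the paper's exact algorithm (the paper's analysis relies on specific separation assumptions and initialization details); the function names and the farthest-first seeding used here are our own choices for a self-contained example.

```python
import numpy as np

def farthest_first_init(points, k):
    """Deterministic seeding: greedily pick k well-spread points
    (an illustrative choice, not the paper's initialization)."""
    centers = [points[0]]
    for _ in range(k - 1):
        dists = np.min(
            np.linalg.norm(points[:, None] - np.array(centers)[None], axis=2),
            axis=1)
        centers.append(points[np.argmax(dists)])
    return np.array(centers, dtype=float)

def lloyds(points, k, n_iter=20):
    """Plain Lloyd's method: alternate nearest-center assignment
    and center recomputation."""
    centers = farthest_first_init(points, k)
    for _ in range(n_iter):
        labels = np.argmin(
            np.linalg.norm(points[:, None] - centers[None], axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):  # guard against empty clusters
                centers[j] = points[labels == j].mean(axis=0)
    return centers

def kfed_sketch(device_data, k_prime, k):
    """One-shot federated clustering sketch: local k'-means on each
    device, then a single round of communication sends the local
    centers to the server, which clusters them into k global centers."""
    local_centers = [lloyds(x, k_prime) for x in device_data]
    pooled = np.vstack(local_centers)  # the only communicated payload
    return lloyds(pooled, k)
```

Note how heterogeneity helps here: when each device holds points from only $k'$ of the $k$ clusters, its local problem is much easier than the global one, and the server only ever clusters a small set of already-summarized centers.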
Cite
Text
Dennis et al. "Heterogeneity for the Win: One-Shot Federated Clustering." International Conference on Machine Learning, 2021.
Markdown
[Dennis et al. "Heterogeneity for the Win: One-Shot Federated Clustering." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/dennis2021icml-heterogeneity/)
BibTeX
@inproceedings{dennis2021icml-heterogeneity,
title = {{Heterogeneity for the Win: One-Shot Federated Clustering}},
author = {Dennis, Don Kurian and Li, Tian and Smith, Virginia},
booktitle = {International Conference on Machine Learning},
year = {2021},
pages = {2611--2620},
volume = {139},
url = {https://mlanthology.org/icml/2021/dennis2021icml-heterogeneity/}
}