A Convergent Federated Clustering Algorithm Without Initial Condition

Abstract

In this paper, we define a new clustering framework for federated learning (FL) based on the (optimal) local models of the users: two users belong to the same cluster if their local models are close. We propose an algorithm, \emph{Successive Refine Federated Clustering Algorithm} (\texttt{SR-FCA}), which starts by treating each user as a singleton cluster and then successively refines the cluster estimates by exploiting similarity with other users. At every intermediate step, \texttt{SR-FCA} uses an \emph{error-tolerant} federated learning algorithm within each cluster to exploit simultaneous training and to correct clustering errors. Unlike prominent prior works such as~\cite{ghosh_efficient_2021}, \texttt{SR-FCA} does not require a \emph{good} initialization (or warm start), either in theory or in practice. We show that, with a proper choice of learning rate, \texttt{SR-FCA} incurs an arbitrarily small clustering error. Moreover, unlike several prior works, \texttt{SR-FCA} does not require knowledge of the number of clusters \emph{a priori}. We also validate the performance of our algorithm on real-world FL datasets, including FEMNIST and Shakespeare, in non-convex problems, and show the benefits of \texttt{SR-FCA} over several baselines.
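To make the two phases concrete, below is a minimal Python sketch of the idea under strong simplifying assumptions, not the paper's reference implementation: models are flat NumPy vectors, each user exposes a gradient oracle for its local loss, the distance is the Euclidean norm, and the error-tolerant aggregator is a coordinate-wise trimmed mean. The function names (`one_shot`, `refine`, `trimmed_mean`), the threshold `lam`, and the toy quadratic losses are all hypothetical, chosen only to illustrate the structure.

```python
# Minimal, illustrative sketch of the SR-FCA idea (NOT the authors' code).
# Hypothetical assumptions: models are flat numpy vectors, grads[i] is
# user i's gradient oracle, and the error-tolerant aggregator is a
# coordinate-wise trimmed mean.
import numpy as np

def trimmed_mean(vectors, trim_frac=0.1):
    """Error-tolerant aggregation: drop the top/bottom `trim_frac`
    fraction of values in every coordinate, then average the rest."""
    V = np.sort(np.stack(vectors), axis=0)
    k = int(trim_frac * len(vectors))
    return V[k:len(vectors) - k].mean(axis=0)

def one_shot(local_models, lam):
    """Initialization: every user starts as a singleton cluster; merge
    users whose local models are within distance `lam` (connected
    components of the threshold graph), so no warm start is needed."""
    n = len(local_models)
    parent = list(range(n))
    def find(i):                      # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(local_models[i] - local_models[j]) <= lam:
                parent[find(i)] = find(j)
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

def refine(clusters, local_models, grads, lr=0.1, rounds=100):
    """One refinement step: error-tolerant federated training inside each
    cluster, then reassign every user to the cluster whose trained model
    is closest to its own local model (correcting clustering errors)."""
    centers = []
    for c in clusters:
        w = np.mean([local_models[i] for i in c], axis=0)
        for _ in range(rounds):
            w = w - lr * trimmed_mean([grads[i](w) for i in c])
        centers.append(w)
    new_clusters = [[] for _ in centers]
    for i, m in enumerate(local_models):
        j = int(np.argmin([np.linalg.norm(m - w) for w in centers]))
        new_clusters[j].append(i)
    return [c for c in new_clusters if c]

# Toy usage: 20 users drawn from two ground-truth parameter vectors,
# with quadratic local losses f_i(w) = ||w - theta_i||^2 / 2.
rng = np.random.default_rng(0)
truth = [np.array([1.0, 1.0]), np.array([-1.0, -1.0])]
thetas = [truth[i % 2] + 0.1 * rng.standard_normal(2) for i in range(20)]
grads = [lambda w, t=t: w - t for t in thetas]  # gradient of quadratic loss
local_models = thetas                           # exact local minimizers here
clusters = one_shot(local_models, lam=1.0)
for _ in range(3):                              # successive refinement
    clusters = refine(clusters, local_models, grads)
print(sorted(map(sorted, clusters)))
```

The singleton-cluster initialization in `one_shot` is what removes the need for an initial condition: the cluster structure is discovered from pairwise model distances rather than assumed, and the error-tolerant aggregation inside `refine` keeps a few misclustered users from derailing the per-cluster training.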

Cite

Text

Vardhan et al. "A Convergent Federated Clustering Algorithm Without Initial Condition." ICML 2023 Workshops: FL, 2023.

Markdown

[Vardhan et al. "A Convergent Federated Clustering Algorithm Without Initial Condition." ICML 2023 Workshops: FL, 2023.](https://mlanthology.org/icmlw/2023/vardhan2023icmlw-convergent/)

BibTeX

@inproceedings{vardhan2023icmlw-convergent,
  title     = {{A Convergent Federated Clustering Algorithm Without Initial Condition}},
  author    = {Vardhan, Harsh and Ghosh, Avishek and Mazumdar, Arya},
  booktitle = {ICML 2023 Workshops: FL},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/vardhan2023icmlw-convergent/}
}