Towards Provably Personalized Federated Learning via Threshold-Clustering of Similar Clients

Abstract

Clustering clients with similar objectives together and learning a model per cluster is an intuitive and interpretable approach to personalization in federated learning (PFL). However, doing so with provable and optimal guarantees has remained an open challenge. In this work, we formalize personalized federated learning as a stochastic optimization problem where the stochastic gradients on a client may correspond to one of $K$ distributions. In this setting, we show that combining i) a simple thresholding-based clustering algorithm with ii) local client momentum achieves optimal convergence guarantees. In fact, our rates asymptotically match those obtained if we knew the true underlying clustering of the clients. Further, we extend our algorithm to the decentralized setting, where each node performs clustering using itself as the center.
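To make the two ingredients concrete, here is a minimal sketch of one aggregation round combining a thresholding rule with local momentum. The function names, the threshold `tau`, and the momentum coefficient `beta` are illustrative assumptions, not the paper's exact algorithm: a center keeps each client gradient that lies within distance `tau` of its own gradient and substitutes its own gradient for the rest, then folds the thresholded average into a momentum buffer.

```python
import numpy as np

def threshold_aggregate(grads, center_idx, tau):
    """Hypothetical thresholding step: gradients farther than `tau` from the
    center's gradient are replaced by the center's own gradient before averaging."""
    center = grads[center_idx]
    kept = [g if np.linalg.norm(g - center) <= tau else center for g in grads]
    return np.mean(kept, axis=0)

def momentum_step(buffer, agg_grad, beta=0.9):
    """Standard local momentum update on the aggregated gradient."""
    return beta * buffer + (1.0 - beta) * agg_grad

# Toy example: two similar clients and one outlier, client 0 as center.
grads = [np.array([0.0, 0.0]), np.array([0.1, 0.0]), np.array([5.0, 5.0])]
agg = threshold_aggregate(grads, center_idx=0, tau=1.0)  # outlier replaced by center
buffer = momentum_step(np.zeros(2), agg)
```

In the decentralized variant described in the abstract, every node would run this aggregation with itself as `center_idx`, so each node effectively clusters its peers around its own gradient.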

Cite

Text

Werner et al. "Towards Provably Personalized Federated Learning via Threshold-Clustering of Similar Clients." NeurIPS 2022 Workshops: Federated_Learning, 2022.

Markdown

[Werner et al. "Towards Provably Personalized Federated Learning via Threshold-Clustering of Similar Clients." NeurIPS 2022 Workshops: Federated_Learning, 2022.](https://mlanthology.org/neuripsw/2022/werner2022neuripsw-provably/)

BibTeX

@inproceedings{werner2022neuripsw-provably,
  title     = {{Towards Provably Personalized Federated Learning via Threshold-Clustering of Similar Clients}},
  author    = {Werner, Mariel and He, Lie and Karimireddy, Sai Praneeth and Jordan, Michael and Jaggi, Martin},
  booktitle = {NeurIPS 2022 Workshops: Federated_Learning},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/werner2022neuripsw-provably/}
}