Clustering-Guided Federated Learning of Representations
Abstract
Federated self-supervised learning (FedSSL) methods have proven effective for learning representations from unlabeled data distributed across multiple clients, possibly heterogeneously. However, substantial room for improvement remains for FedSSL methods, especially in the case of highly heterogeneous data and a large number of classes. In this paper, we introduce the federated representation learning through clustering (FedRLC) scheme, which utilizes i) a crossed KL divergence loss with a data selection strategy during local training and ii) a dynamic upload of local cluster centers during communication updates. Experimental results show that FedRLC achieves state-of-the-art results on widely used benchmarks, even in highly heterogeneous settings and on datasets with a large number of classes such as CIFAR-100.
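The abstract does not spell out the crossed KL divergence objective, but one plausible reading is a KL term applied in both directions between soft cluster assignments computed against local and global cluster centers. The sketch below is a minimal, hypothetical illustration under that assumption; the function names, the cosine-similarity soft assignment, the temperature, and the symmetric weighting are illustrative choices, not the paper's actual formulation.

```python
# Hypothetical sketch of a "crossed" KL-divergence objective between soft
# cluster assignments. The exact FedRLC loss is not given in the abstract,
# so every detail here is an illustrative assumption.
import torch
import torch.nn.functional as F


def soft_assignments(features: torch.Tensor, centers: torch.Tensor,
                     temperature: float = 0.5) -> torch.Tensor:
    """Soft cluster assignments from cosine similarity to cluster centers."""
    sim = F.normalize(features, dim=1) @ F.normalize(centers, dim=1).T
    return F.softmax(sim / temperature, dim=1)


def crossed_kl_loss(local_feats: torch.Tensor, global_feats: torch.Tensor,
                    local_centers: torch.Tensor,
                    global_centers: torch.Tensor) -> torch.Tensor:
    """KL terms 'crossed' between local and global assignment distributions."""
    # Local features scored against global centers, and vice versa.
    p_local = soft_assignments(local_feats, global_centers)
    p_global = soft_assignments(global_feats, local_centers)
    # Symmetric KL with detached targets so each side matches the other.
    kl_lg = F.kl_div(p_local.log(), p_global.detach(), reduction="batchmean")
    kl_gl = F.kl_div(p_global.log(), p_local.detach(), reduction="batchmean")
    return 0.5 * (kl_lg + kl_gl)


# Usage with random stand-ins for encoder features and k-means centers.
feats_local = torch.randn(32, 128)
feats_global = torch.randn(32, 128)
centers_local = torch.randn(10, 128)
centers_global = torch.randn(10, 128)
loss = crossed_kl_loss(feats_local, feats_global, centers_local, centers_global)
```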
Cite
Text
Miao and Koyuncu. "Clustering-Guided Federated Learning of Representations." ICML 2023 Workshops: FL, 2023.
Markdown
[Miao and Koyuncu. "Clustering-Guided Federated Learning of Representations." ICML 2023 Workshops: FL, 2023.](https://mlanthology.org/icmlw/2023/miao2023icmlw-clusteringguided/)
BibTeX
@inproceedings{miao2023icmlw-clusteringguided,
title = {{Clustering-Guided Federated Learning of Representations}},
author = {Miao, Runxuan and Koyuncu, Erdem},
booktitle = {ICML 2023 Workshops: FL},
year = {2023},
url = {https://mlanthology.org/icmlw/2023/miao2023icmlw-clusteringguided/}
}