Revisiting Ensembling in One-Shot Federated Learning

Abstract

Federated Learning (FL) is an appealing approach to training machine learning models without sharing raw data. However, standard FL algorithms are iterative and thus induce a significant communication cost. One-Shot FL (OFL) replaces the iterative exchange of models between clients and the server with a single round of communication, thereby saving substantially on communication costs. Not surprisingly, OFL exhibits an accuracy gap with respect to FL, especially under high data heterogeneity. We introduce Fens, a novel federated ensembling scheme that approaches the accuracy of FL with the communication efficiency of OFL. Learning in Fens proceeds in two phases: first, clients train models locally and send them to the server, similar to OFL; second, clients collaboratively train a lightweight prediction aggregator model using FL. We showcase the effectiveness of Fens through exhaustive experiments spanning several datasets and heterogeneity levels. In the particular case of the heterogeneously distributed CIFAR-10 dataset, Fens achieves up to $26.9$% higher accuracy than SOTA OFL, while being only $3.1$% lower than FL. At the same time, Fens incurs at most $4.3\times$ more communication than OFL, whereas FL is at least $10.9\times$ more communication-intensive than Fens.
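To make the two-phase recipe concrete, below is a minimal sketch in PyTorch. It is not the authors' implementation: the `LocalModel` and `Aggregator` architectures, the FedAvg-style averaging in phase 2, and all hyperparameters are illustrative assumptions; only the overall structure (one-shot local training and upload, followed by federated training of a lightweight aggregator over the ensemble's predictions) mirrors the abstract.

```python
# Sketch of the two-phase Fens scheme; all names and architectures are
# illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLIENTS, NUM_CLASSES = 10, 10

class LocalModel(nn.Module):
    """Placeholder client model trained only on local data (phase 1)."""
    def __init__(self, in_dim=32 * 32 * 3, num_classes=NUM_CLASSES):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(in_dim, num_classes))

    def forward(self, x):
        return self.net(x)

class Aggregator(nn.Module):
    """Lightweight model mapping the ensemble's stacked logits to a prediction."""
    def __init__(self, num_clients=NUM_CLIENTS, num_classes=NUM_CLASSES):
        super().__init__()
        self.head = nn.Linear(num_clients * num_classes, num_classes)

    def forward(self, stacked_logits):
        return self.head(stacked_logits)

def phase1_local_training(client_loaders, epochs=1):
    """Phase 1 (one-shot): each client trains locally and uploads its model once."""
    models = []
    for loader in client_loaders:
        model = LocalModel()
        opt = torch.optim.SGD(model.parameters(), lr=0.01)
        for _ in range(epochs):
            for x, y in loader:
                opt.zero_grad()
                F.cross_entropy(model(x), y).backward()
                opt.step()
        models.append(model)  # sent to the server, as in OFL
    return models

def ensemble_logits(models, x):
    """Concatenate per-client logits; this is the aggregator's input."""
    with torch.no_grad():
        return torch.cat([m(x) for m in models], dim=1)

def phase2_train_aggregator(models, client_loaders, rounds=5, local_steps=1):
    """Phase 2: clients jointly train the small aggregator with FedAvg-style FL."""
    global_agg = Aggregator()
    for _ in range(rounds):
        client_states = []
        for loader in client_loaders:
            local_agg = Aggregator()
            local_agg.load_state_dict(global_agg.state_dict())
            opt = torch.optim.SGD(local_agg.parameters(), lr=0.1)
            for _ in range(local_steps):
                for x, y in loader:
                    opt.zero_grad()
                    loss = F.cross_entropy(local_agg(ensemble_logits(models, x)), y)
                    loss.backward()
                    opt.step()
            client_states.append(local_agg.state_dict())
        # Server averages only the lightweight aggregator's parameters,
        # which is why phase 2 remains cheap in communication.
        avg_state = {
            k: torch.stack([s[k] for s in client_states]).mean(dim=0)
            for k in client_states[0]
        }
        global_agg.load_state_dict(avg_state)
    return global_agg
```

Because only the small aggregator is exchanged iteratively in phase 2, the extra communication on top of the one-shot model upload stays modest, which is the trade-off the abstract quantifies.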

Cite

Text

Allouah et al. "Revisiting Ensembling in One-Shot Federated Learning." Neural Information Processing Systems, 2024. doi:10.52202/079017-2188

Markdown

[Allouah et al. "Revisiting Ensembling in One-Shot Federated Learning." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/allouah2024neurips-revisiting/) doi:10.52202/079017-2188

BibTeX

@inproceedings{allouah2024neurips-revisiting,
  title     = {{Revisiting Ensembling in One-Shot Federated Learning}},
  author    = {Allouah, Youssef and Dhasade, Akash and Guerraoui, Rachid and Gupta, Nirupam and Kermarrec, Anne-Marie and Pinot, Rafael and Pires, Rafael and Sharma, Rishi},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-2188},
  url       = {https://mlanthology.org/neurips/2024/allouah2024neurips-revisiting/}
}