Resource-Efficient Federated Learning

Abstract

Federated Learning (FL) is a distributed training paradigm that avoids sharing users’ private data. FL presents unique challenges in dealing with data, device, and user heterogeneity, which impact both model quality and training time; this impact is exacerbated by the scale of deployments. More importantly, existing FL methods use resources inefficiently and prolong training. In this work, we propose REFL to systematically address the question of resource efficiency in FL, showing the benefits of intelligent participant selection and the incorporation of updates from straggling participants. REFL is a resource-efficient federated learning system that maximizes resource efficiency without compromising statistical and system efficiency. REFL is released as open source at https://github.com/ahmedcs/REFL.

Cite

Text

Abdelmoniem et al. "Resource-Efficient Federated Learning." ICML 2023 Workshops: FL, 2023.

Markdown

[Abdelmoniem et al. "Resource-Efficient Federated Learning." ICML 2023 Workshops: FL, 2023.](https://mlanthology.org/icmlw/2023/abdelmoniem2023icmlw-resourceefficient/)

BibTeX

@inproceedings{abdelmoniem2023icmlw-resourceefficient,
  title     = {{Resource-Efficient Federated Learning}},
  author    = {Abdelmoniem, Ahmed M. and Sahu, Atal Narayan and Canini, Marco and Fahmy, Suhaib A.},
  booktitle = {ICML 2023 Workshops: FL},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/abdelmoniem2023icmlw-resourceefficient/}
}