Green Federated Learning

Abstract

The amount of compute used to train state-of-the-art models is increasing exponentially (doubling every 10 months between 2015 and 2022), resulting in a large carbon footprint. Federated Learning (FL) can also be resource-intensive and can have a significant carbon footprint, particularly when deployed at scale. Unlike centralized AI, which can reliably tap into renewables at strategically placed data centers, cross-device FL may leverage as many as hundreds of millions of globally distributed end-user devices with diverse energy sources. Green AI is a novel and important research area in which carbon footprint is regarded as an evaluation criterion for AI, alongside accuracy, convergence speed, and other metrics. In this paper, we propose the concept of Green FL, which involves optimizing FL parameters and making design choices to minimize carbon emissions while maintaining competitive performance and training time. First, we adopt a data-driven approach to quantify the carbon emissions of FL by directly measuring real-world, at-scale FL tasks running on millions of phones. Second, we present challenges, guidelines, and lessons learned from studying the trade-off between energy efficiency, performance, and time-to-train in a production FL system.
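To make the idea of quantifying FL carbon emissions concrete, the following is a minimal back-of-the-envelope sketch: total emissions as (energy drawn by participating devices plus server energy) weighted by grid carbon intensity. This is not the paper's measurement methodology; the function, parameter names, and all numeric values are illustrative assumptions.

```python
# Hypothetical estimator for the carbon footprint of a cross-device FL task.
# All parameters and values below are illustrative assumptions, not figures
# from the paper, which instead measures real-world at-scale FL tasks.

def fl_carbon_kg(num_rounds, clients_per_round,
                 client_energy_kwh, server_energy_kwh_per_round,
                 client_carbon_intensity, server_carbon_intensity):
    """Estimate total CO2-equivalent emissions (kg) of one FL training run.

    Carbon intensities are in kg CO2e per kWh; they differ for clients and
    server because end-user devices draw on diverse, globally distributed
    energy sources, while data centers can tap into renewables.
    """
    client_kwh = num_rounds * clients_per_round * client_energy_kwh
    server_kwh = num_rounds * server_energy_kwh_per_round
    return (client_kwh * client_carbon_intensity
            + server_kwh * server_carbon_intensity)

# Example: 1,000 rounds with 200 phones per round (assumed numbers).
estimate = fl_carbon_kg(
    num_rounds=1000,
    clients_per_round=200,
    client_energy_kwh=0.001,           # per-client energy for one round
    server_energy_kwh_per_round=0.05,  # aggregation-server energy per round
    client_carbon_intensity=0.45,      # kg CO2e/kWh, assumed average grid
    server_carbon_intensity=0.10,      # kg CO2e/kWh, greener data-center grid
)
print(round(estimate, 1))  # → 95.0 kg CO2e for the whole run
```

A sketch like this makes the trade-offs in the abstract visible: increasing clients per round or the number of rounds may improve accuracy or time-to-train, but each choice scales the emissions estimate directly.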

Cite

Text

Yousefpour et al. "Green Federated Learning." ICML 2023 Workshops: FL, 2023.

Markdown

[Yousefpour et al. "Green Federated Learning." ICML 2023 Workshops: FL, 2023.](https://mlanthology.org/icmlw/2023/yousefpour2023icmlw-green/)

BibTeX

@inproceedings{yousefpour2023icmlw-green,
  title     = {{Green Federated Learning}},
  author    = {Yousefpour, Ashkan and Guo, Shen and Shenoy, Ashish and Ghosh, Sayan and Stock, Pierre and Maeng, Kiwan and Krüger, Schalk-Willem and Rabbat, Michael and Wu, Carole-Jean and Mironov, Ilya},
  booktitle = {ICML 2023 Workshops: FL},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/yousefpour2023icmlw-green/}
}