EFL: Elastic Federated Learning on Non-IID Data

Abstract

Federated learning involves training machine learning models over devices or data silos, such as edge processors or data warehouses, while keeping the data local. However, training in heterogeneous and potentially massive networks introduces bias into the system, originating from the non-IID data and the low participation rate. In this paper, we propose Elastic Federated Learning (EFL), an unbiased federated training framework capable of tackling the heterogeneity in the system. EFL extends lifelong learning to realistic federated settings, makes the most informative parameters less volatile during training, and utilizes incomplete local updates. It is also an efficient and effective algorithm that compresses upstream and downstream communications with a convergence guarantee. We empirically demonstrate the efficacy of our framework on a variety of non-IID datasets and show the competitive performance of the algorithm on robustness and efficiency.
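Two of the abstract's ingredients have well-known counterparts that can be sketched concretely: keeping the most informative parameters less volatile resembles an elastic quadratic penalty from lifelong learning (as in EWC-style regularization), and compressing upstream/downstream communication is often done with top-k sparsification of updates. The sketch below is a rough, hypothetical illustration of those two generic ideas, not the paper's actual objective or compression scheme; all names and the `lam` hyperparameter are assumptions for illustration.

```python
def elastic_penalty(params, anchors, importance, lam=0.1):
    """Quadratic penalty discouraging drift in important parameters.

    params     : current model parameters (flat list of floats)
    anchors    : reference parameters (e.g., from the previous round)
    importance : per-parameter importance weights (e.g., a diagonal
                 Fisher estimate); larger weight = less volatile
    lam        : penalty strength (hypothetical hyperparameter)
    """
    return lam * sum(w * (p - a) ** 2
                     for p, a, w in zip(params, anchors, importance))


def topk_sparsify(update, k):
    """Keep only the k largest-magnitude entries of an update vector.

    Returns a sparse {index: value} dict, a common way to compress
    what a client uploads (or a server broadcasts) each round.
    """
    top = sorted(range(len(update)),
                 key=lambda i: abs(update[i]), reverse=True)[:k]
    return {i: update[i] for i in top}
```

For example, `elastic_penalty([1.0, 2.0], [0.0, 2.0], [2.0, 1.0], lam=0.5)` charges only for the first parameter's drift, and `topk_sparsify([0.1, -3.0, 0.5], 2)` keeps just the two dominant coordinates of a length-3 update.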

Cite

Text

Ma et al. "EFL: Elastic Federated Learning on Non-IID Data." Proceedings of The 1st Conference on Lifelong Learning Agents, 2022.

Markdown

[Ma et al. "EFL: Elastic Federated Learning on Non-IID Data." Proceedings of The 1st Conference on Lifelong Learning Agents, 2022.](https://mlanthology.org/collas/2022/ma2022collas-efl/)

BibTeX

@inproceedings{ma2022collas-efl,
  title     = {{EFL: Elastic Federated Learning on Non-IID Data}},
  author    = {Ma, Zichen and Lu, Yu and Li, Wenye and Cui, Shuguang},
  booktitle = {Proceedings of The 1st Conference on Lifelong Learning Agents},
  year      = {2022},
  pages     = {92--115},
  volume    = {199},
  url       = {https://mlanthology.org/collas/2022/ma2022collas-efl/}
}