Federated Learning with Online Adaptive Heterogeneous Local Models
Abstract
In Federated Learning (FL), one of the biggest challenges is that client devices often have drastically different computation and communication resources for local updates. To address this challenge, recent research efforts have focused on training heterogeneous local models obtained by adaptively pruning a shared global model. Despite the empirical success, a theoretical analysis of the convergence of these heterogeneous FL algorithms remains open. In this paper, we establish sufficient conditions for any FL algorithm with heterogeneous local models to converge to a neighborhood of a stationary point of standard FL at a rate of $O(\frac{1}{\sqrt{Q}})$. For general smooth cost functions and under standard assumptions, our analysis illuminates two key factors impacting the optimality gap between heterogeneous and standard FL: pruning-induced noise and the minimum coverage index, motivating a joint design of the local models' pruning masks in heterogeneous FL algorithms. The results are numerically validated on the MNIST and CIFAR-10 datasets.
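As a minimal illustrative sketch (not the authors' code), the snippet below shows one plausible reading of the minimum coverage index for a set of binary pruning masks: count how many clients retain each global-model parameter and take the minimum over parameters. The function name and the mask representation are assumptions introduced here for illustration.

```python
import numpy as np

def minimum_coverage_index(masks):
    """Illustrative coverage-index computation (assumed definition).

    masks: list of 1-D binary arrays, one per client, each the same
    length as the flattened global model; entry 1 means the client's
    pruned local model keeps that parameter.
    Returns the smallest number of clients covering any parameter.
    """
    coverage = np.sum(np.stack(masks, axis=0), axis=0)  # per-parameter retention count
    return int(coverage.min())

# Example: 3 clients, 4 global parameters; every parameter is kept by
# at least one client, so the index here is 1.
masks = [np.array([1, 1, 0, 0]),
         np.array([0, 1, 1, 0]),
         np.array([1, 0, 1, 1])]
print(minimum_coverage_index(masks))  # -> 1
```

Under this reading, a larger index means each parameter is updated by more clients per round, which is consistent with the abstract's suggestion to design the pruning masks jointly across clients.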
Cite
Text
Zhou et al. "Federated Learning with Online Adaptive Heterogeneous Local Models." NeurIPS 2022 Workshops: Federated_Learning, 2022.
Markdown
[Zhou et al. "Federated Learning with Online Adaptive Heterogeneous Local Models." NeurIPS 2022 Workshops: Federated_Learning, 2022.](https://mlanthology.org/neuripsw/2022/zhou2022neuripsw-federated/)
BibTeX
@inproceedings{zhou2022neuripsw-federated,
title = {{Federated Learning with Online Adaptive Heterogeneous Local Models}},
author = {Zhou, Hanhan and Lan, Tian and Venkataramani, Guru Prasadh and Ding, Wenbo},
booktitle = {NeurIPS 2022 Workshops: Federated_Learning},
year = {2022},
url = {https://mlanthology.org/neuripsw/2022/zhou2022neuripsw-federated/}
}