Bidirectional Adaptive Communication for Heterogeneous Distributed Learning

Abstract

Communication is a key bottleneck in distributed optimization, and, in particular, bandwidth and latency can be limiting factors when devices are connected over commodity networks, such as in Federated Learning. State-of-the-art techniques tackle these challenges with advanced compression methods or by delaying communication rounds according to predefined schedules. We present a new scheme that adaptively skips communication (broadcast and client uploads) by detecting slow-varying updates. The scheme automatically adjusts the communication frequency independently for each worker and the server. By utilizing an error-feedback mechanism, borrowed from the compression literature, we prove that the convergence rate is the same as for batch gradient descent in the convex and nonconvex smooth cases. We show that the total number of communication rounds between server and clients needed to achieve a targeted accuracy is reduced, even when the data distribution is highly non-IID.
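
To illustrate the general idea described in the abstract (not the authors' exact algorithm), the following Python sketch shows a worker skipping its upload whenever its accumulated update is small, while an error-feedback residual retains the skipped mass for a later round. The quadratic local objectives, the threshold tau, and all function and parameter names are hypothetical choices made for this toy example.

import numpy as np

def local_grad(w, A, b):
    # Gradient of the toy local objective 0.5 * ||A w - b||^2.
    return A.T @ (A @ w - b)

def adaptive_skip_training(workers, w0, lr=0.01, rounds=300, tau=0.05):
    # Toy loop: each worker uploads only when its error-compensated
    # accumulated update exceeds tau (illustrative threshold rule,
    # not the paper's exact criterion).
    w = w0.copy()
    residuals = [np.zeros_like(w0) for _ in workers]  # error-feedback memory
    uploads = 0
    for _ in range(rounds):
        aggregate = np.zeros_like(w0)
        for i, (A, b) in enumerate(workers):
            residuals[i] += -lr * local_grad(w, A, b)  # accumulate locally
            if np.linalg.norm(residuals[i]) >= tau:
                aggregate += residuals[i]              # upload accumulated update
                residuals[i] = np.zeros_like(w0)
                uploads += 1
            # else: skip this upload; the residual carries over (error feedback)
        w += aggregate / len(workers)
    return w, uploads

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, rounds, n_workers = 5, 300, 4
    workers = [(rng.standard_normal((20, d)), rng.standard_normal(20))
               for _ in range(n_workers)]
    w, uploads = adaptive_skip_training(workers, np.zeros(d), rounds=rounds)
    print(f"uploads used: {uploads} / {rounds * n_workers} possible")

In this sketch the threshold tau controls the communication/accuracy trade-off: a larger tau skips more uploads, and the error-feedback residuals ensure that skipped updates are eventually applied rather than lost.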

Cite

Text

Avdiukhin et al. "Bidirectional Adaptive Communication for Heterogeneous Distributed Learning." NeurIPS 2022 Workshops: OPT, 2022.

Markdown

[Avdiukhin et al. "Bidirectional Adaptive Communication for Heterogeneous Distributed Learning." NeurIPS 2022 Workshops: OPT, 2022.](https://mlanthology.org/neuripsw/2022/avdiukhin2022neuripsw-bidirectional/)

BibTeX

@inproceedings{avdiukhin2022neuripsw-bidirectional,
  title     = {{Bidirectional Adaptive Communication for Heterogeneous Distributed Learning}},
  author    = {Avdiukhin, Dmitrii and Braverman, Vladimir and Ivkin, Nikita and Stich, Sebastian U},
  booktitle = {NeurIPS 2022 Workshops: OPT},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/avdiukhin2022neuripsw-bidirectional/}
}