Asynchronous Federated Learning with Bidirectional Quantized Communications and Buffered Aggregation

Abstract

Asynchronous Federated Learning with Buffered Aggregation (FedBuff) is a state-of-the-art algorithm known for its efficiency and high scalability. However, it has a high communication cost, which has not been examined with quantized communications. To tackle this problem, we present a new algorithm (QAFeL), with a quantization scheme that establishes a shared "hidden" state between the server and clients to avoid the error propagation caused by direct quantization. This approach allows for high precision while significantly reducing the data transmitted during client-server interactions. We provide theoretical convergence guarantees for QAFeL and corroborate our analysis with experiments on a standard benchmark.
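The shared hidden-state idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: it assumes a simple uniform quantizer and a single vector-valued state, whereas QAFeL applies the scheme to model updates in both communication directions. The key point shown is that quantizing the *difference* from a hidden state that both sides update identically prevents quantization errors from accumulating across rounds, unlike quantizing the model directly.

```python
import numpy as np

def quantize(v, levels=16):
    # Stand-in uniform quantizer (any shared compressor would do).
    scale = max(float(np.max(np.abs(v))), 1e-12)
    step = 2 * scale / (levels - 1)
    return np.round(v / step) * step

# Hypothetical sketch: sender and receiver keep an identical "hidden"
# state h. Only the quantized difference q = Q(x - h) is transmitted;
# both sides then apply the same update h <- h + q, so h stays in sync
# and tracks x without errors compounding round over round.
h = np.zeros(4)  # shared hidden state, identical on both sides
for x in [np.array([0.5, -1.2, 3.3, 0.1]),   # model state, round 1
          np.array([0.6, -1.0, 3.5, 0.2])]:  # model state, round 2
    q = quantize(x - h)  # the only quantity sent over the network
    h = h + q            # same update applied by sender and receiver
```

After each round, `h` approximates the current `x` to within one quantization step of the (shrinking) difference, illustrating why the hidden-state scheme retains high precision at low communication cost.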

Cite

Text

Ortega and Jafarkhani. "Asynchronous Federated Learning with Bidirectional Quantized Communications and Buffered Aggregation." ICML 2023 Workshops: FL, 2023.

Markdown

[Ortega and Jafarkhani. "Asynchronous Federated Learning with Bidirectional Quantized Communications and Buffered Aggregation." ICML 2023 Workshops: FL, 2023.](https://mlanthology.org/icmlw/2023/ortega2023icmlw-asynchronous/)

BibTeX

@inproceedings{ortega2023icmlw-asynchronous,
  title     = {{Asynchronous Federated Learning with Bidirectional Quantized Communications and Buffered Aggregation}},
  author    = {Ortega, Tomas and Jafarkhani, Hamid},
  booktitle = {ICML 2023 Workshops: FL},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/ortega2023icmlw-asynchronous/}
}