Quantized Decentralized Stochastic Learning over Directed Graphs

Abstract

We consider a decentralized stochastic learning problem where data points are distributed among computing nodes that communicate over a directed graph. As models grow large, decentralized learning faces a major bottleneck: the heavy communication load incurred when each node transmits large messages (model updates) to its neighbors. To tackle this bottleneck, we propose a quantized decentralized stochastic learning algorithm over directed graphs, based on the push-sum algorithm from decentralized consensus optimization. We prove that our algorithm achieves the same convergence rates as decentralized stochastic learning with exact communication, for both convex and non-convex losses. Numerical evaluations corroborate our main theoretical results and illustrate a significant speed-up over exact-communication methods.
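To give a feel for the idea in the abstract, here is a minimal Python sketch of push-sum consensus over a small directed graph in which each node quantizes its outgoing message with unbiased stochastic rounding before sending. The particular graph, mixing weights, grid resolution `delta`, and the choice to quantize only the x-messages (keeping the push-sum weights y exact) are all illustrative assumptions for this sketch, not the paper's actual algorithm or analysis.

```python
import math
import random

def stochastic_round(v, delta, rng):
    """Unbiased stochastic rounding of scalar v to a grid of resolution delta."""
    scaled = v / delta
    low = math.floor(scaled)
    # Round up with probability equal to the fractional part, so E[q(v)] = v.
    return delta * (low + (rng.random() < scaled - low))

def quantized_push_sum(values, num_iters=100, delta=1 / 64, seed=0):
    """Push-sum consensus over a directed graph with quantized messages.

    Each node i holds a value x_i and a weight y_i; the ratio x_i / y_i
    converges to the average of the initial values. Here only the x-messages
    are quantized (an illustrative simplification of the paper's scheme).
    """
    rng = random.Random(seed)
    n = len(values)  # assumes n >= 3 for the hard-coded graph below
    x = [float(v) for v in values]
    y = [1.0] * n
    # Column-stochastic mixing: out[j] maps each receiver i to the fraction
    # of node j's mass it gets. Node 0 has out-degree 3, the rest 2, so the
    # mixing matrix is column- but not row-stochastic (a genuinely directed
    # setup, which is what push-sum is designed to handle).
    out = {0: {0: 1 / 3, 1: 1 / 3, 2: 1 / 3}}
    for j in range(1, n):
        out[j] = {j: 0.5, (j + 1) % n: 0.5}
    for _ in range(num_iters):
        qx = [stochastic_round(v, delta, rng) for v in x]  # quantize, then send
        new_x, new_y = [0.0] * n, [0.0] * n
        for j, targets in out.items():
            for i, w in targets.items():
                new_x[i] += w * qx[j]
                new_y[i] += w * y[j]  # push-sum weights kept exact in this sketch
        x, y = new_x, new_y
    return [xi / yi for xi, yi in zip(x, y)]  # de-biased estimates of the mean
```

With four nodes holding the values 1 through 4, every ratio x_i / y_i lands near the true average 2.5, up to quantization noise; shrinking `delta` tightens the estimate at the cost of more bits per message.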

Cite

Text

Taheri et al. "Quantized Decentralized Stochastic Learning over Directed Graphs." International Conference on Machine Learning, 2020.

Markdown

[Taheri et al. "Quantized Decentralized Stochastic Learning over Directed Graphs." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/taheri2020icml-quantized/)

BibTeX

@inproceedings{taheri2020icml-quantized,
  title     = {{Quantized Decentralized Stochastic Learning over Directed Graphs}},
  author    = {Taheri, Hossein and Mokhtari, Aryan and Hassani, Hamed and Pedarsani, Ramtin},
  booktitle = {International Conference on Machine Learning},
  year      = {2020},
  pages     = {9324--9333},
  volume    = {119},
  url       = {https://mlanthology.org/icml/2020/taheri2020icml-quantized/}
}