A Computation and Communication Efficient Method for Distributed Nonconvex Problems in the Partial Participation Setting
Abstract
We present a new method that integrates three key components of distributed optimization and federated learning: variance reduction of stochastic gradients, partial participation, and compressed communication. We prove that the new method has optimal oracle complexity and state-of-the-art communication complexity in the partial participation setting. Even without the communication compression feature, our method successfully combines variance reduction with partial participation: it attains the optimal oracle complexity, never requires the participation of all nodes, and does not rely on the bounded gradients (dissimilarity) assumption.
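The three components named in the abstract can be pictured together in a short toy simulation. The following is a minimal numpy sketch, not the paper's algorithm: SAGA-style control variates stand in for its variance-reduction mechanism, a Rand-K sparsifier for its compressor, and uniform client sampling for partial participation. Every name and constant here (stoch_grad, rand_k, gamma, s, and so on) is an illustrative assumption.

import numpy as np

rng = np.random.default_rng(0)

# Toy setup: n workers, worker i holds f_i(x) = 0.5 * ||A_i @ x - b_i||^2.
n, d, m = 10, 20, 30
A = [rng.standard_normal((m, d)) for _ in range(n)]
b = [rng.standard_normal(m) for _ in range(n)]

def stoch_grad(i, x, batch=10):
    # Minibatch stochastic gradient of f_i; the variance comes from row sampling.
    idx = rng.choice(m, size=batch, replace=False)
    return (m / batch) * A[i][idx].T @ (A[i][idx] @ x - b[i][idx])

def rand_k(v, k=4):
    # Unbiased Rand-K sparsifier: transmit k random coordinates, rescaled by d/k.
    idx = rng.choice(v.size, size=k, replace=False)
    out = np.zeros_like(v)
    out[idx] = (v.size / k) * v[idx]
    return out

gamma, rounds, s = 1e-3, 2000, 3   # step size, iterations, workers sampled per round
x = np.zeros(d)
h = [stoch_grad(i, x) for i in range(n)]  # server memory of each worker's last report
g = np.mean(h, axis=0)                    # aggregated gradient estimator

for _ in range(rounds):
    x = x - gamma * g
    for i in rng.choice(n, size=s, replace=False):  # partial participation
        delta = rand_k(stoch_grad(i, x) - h[i])     # compressed control-variate update
        h[i] = h[i] + delta
        g = g + delta / n

# Report the full-batch gradient norm at the final iterate.
full_grad = np.mean([A[i].T @ (A[i] @ x - b[i]) for i in range(n)], axis=0)
print("final full-gradient norm:", np.linalg.norm(full_grad))

Under these assumed ingredients the server never waits for all n workers in any round, yet the control-variate memory h keeps the estimator g anchored to fresh gradient information, which is the interplay the abstract highlights.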
Cite
Text
Tyurin and Richtarik. "A Computation and Communication Efficient Method for Distributed Nonconvex Problems in the Partial Participation Setting." Neural Information Processing Systems, 2023.
Markdown
[Tyurin and Richtarik. "A Computation and Communication Efficient Method for Distributed Nonconvex Problems in the Partial Participation Setting." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/tyurin2023neurips-computation/)
BibTeX
@inproceedings{tyurin2023neurips-computation,
  title = {{A Computation and Communication Efficient Method for Distributed Nonconvex Problems in the Partial Participation Setting}},
  author = {Tyurin, Alexander and Richtarik, Peter},
  booktitle = {Neural Information Processing Systems},
  year = {2023},
  url = {https://mlanthology.org/neurips/2023/tyurin2023neurips-computation/}
}