On the Performance of Gradient Tracking with Local Updates
Abstract
We study the decentralized optimization problem in which a network of $n$ agents seeks to minimize the average of a set of heterogeneous non-convex cost functions in a distributed manner. State-of-the-art decentralized algorithms such as Exact Diffusion and Gradient Tracking (GT) require communication at every iteration. However, communication is expensive, resource intensive, and slow. This work analyzes a locally updated GT method (LU-GT), in which agents perform local recursions before interacting with their neighbors. While local updates have been shown to reduce communication overhead in practice, their theoretical influence has not been fully characterized. We show that LU-GT has the same communication complexity as the Federated Learning setting while allowing decentralized (symmetric) network topologies, and we prove that the number of local updates does not degrade the quality of the solution achieved by LU-GT.
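The sketch below illustrates one plausible reading of the local-update scheme the abstract describes: each agent takes `tau` descent steps along its gradient tracker `y_i` without communicating, then performs one round of neighbor mixing of both the iterates and the trackers. Everything here is assumed for illustration, not taken from the paper: the quadratic local costs (stand-ins for the heterogeneous non-convex objectives), the ring-topology mixing matrix `W`, and the parameters `tau`, `alpha`, and `rounds`.

```python
import numpy as np

# A minimal LU-GT-style sketch, NOT the authors' reference implementation.
# Assumptions: quadratic local costs f_i(x) = 0.5*||A_i x - b_i||^2 and a
# symmetric, doubly stochastic mixing matrix W over a ring of n agents.

rng = np.random.default_rng(0)
n, d, tau, alpha, rounds = 8, 5, 4, 0.01, 200  # hypothetical parameters

# Heterogeneous quadratic costs (convex stand-ins for the non-convex case).
A = rng.standard_normal((n, d, d))
b = rng.standard_normal((n, d))
grad = lambda i, x: A[i].T @ (A[i] @ x - b[i])

# Ring-topology mixing matrix: 1/2 self-weight, 1/4 to each neighbor.
W = np.eye(n) / 2
for i in range(n):
    W[i, (i - 1) % n] += 0.25
    W[i, (i + 1) % n] += 0.25

x = np.zeros((n, d))                              # agent iterates
y = np.array([grad(i, x[i]) for i in range(n)])   # gradient trackers

for _ in range(rounds):          # communication rounds
    for _ in range(tau):         # local recursions, no communication
        for i in range(n):
            g_old = grad(i, x[i])
            x[i] = x[i] - alpha * y[i]
            y[i] = y[i] + grad(i, x[i]) - g_old   # track gradient change
    x, y = W @ x, W @ y          # one round of neighbor mixing

# After many rounds, iterates should reach consensus and the trackers
# should approximate the average gradient across agents.
avg_grad = np.mean([grad(i, x[i]) for i in range(n)], axis=0)
print("consensus error:", np.linalg.norm(x - x.mean(0)))
print("average gradient norm:", np.linalg.norm(avg_grad))
```

Because `W` is symmetric and doubly stochastic, mixing preserves the network average of the trackers, which is what lets the local steps use `y_i` as a surrogate for the global gradient direction between communication rounds.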
Cite
Text
Nguyen et al. "On the Performance of Gradient Tracking with Local Updates." ICML 2023 Workshops: FL, 2023.
Markdown
[Nguyen et al. "On the Performance of Gradient Tracking with Local Updates." ICML 2023 Workshops: FL, 2023.](https://mlanthology.org/icmlw/2023/nguyen2023icmlw-performance/)
BibTeX
@inproceedings{nguyen2023icmlw-performance,
  title     = {{On the Performance of Gradient Tracking with Local Updates}},
  author    = {Nguyen, Edward Duc Hien and Alghunaim, Sulaiman A and Yuan, Kun and Uribe, Cesar A},
  booktitle = {ICML 2023 Workshops: FL},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/nguyen2023icmlw-performance/}
}