Stabilized Proximal-Point Methods for Federated Optimization
Abstract
In developing efficient optimization algorithms, it is crucial to account for communication constraints, a significant challenge in modern Federated Learning. The best-known communication complexity among non-accelerated algorithms is achieved by DANE, a distributed proximal-point algorithm that solves local subproblems at each iteration and can exploit second-order similarity among individual functions. However, to achieve this communication efficiency, the algorithm requires the local subproblems to be solved sufficiently accurately, resulting in slightly sub-optimal local complexity. Inspired by the hybrid-projection proximal-point method, we propose in this work a novel distributed algorithm, S-DANE. Compared to DANE, this method uses an auxiliary sequence of prox-centers while maintaining the same deterministic communication complexity. Moreover, the accuracy condition for solving the subproblem is milder, leading to enhanced local computation efficiency. Furthermore, S-DANE supports partial client participation and arbitrary stochastic local solvers, making it attractive in practice. We further accelerate S-DANE and show that the resulting algorithm achieves the best-known communication complexity among all existing methods for distributed convex optimization, while still enjoying the good local computation efficiency of S-DANE. Finally, we propose adaptive variants of both methods using line search, obtaining the first provably efficient adaptive algorithms that can exploit local second-order similarity without prior knowledge of any parameters.
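For context, a rough sketch of the local subproblem that a DANE-type proximal-point method asks client $i$ to solve around the current iterate $x^t$ is shown below. This follows the classical DANE formulation and is only illustrative; the exact subproblems, accuracy conditions, and prox-center updates of S-DANE are given in the paper.

$$
x_i^{t+1} \approx \operatorname*{arg\,min}_{x} \; f_i(x) - \bigl\langle \nabla f_i(x^t) - \nabla f(x^t),\, x \bigr\rangle + \frac{\mu}{2}\,\|x - x^t\|^2,
$$

where $f = \tfrac{1}{n}\sum_{i=1}^n f_i$, $\mu \ge 0$ is a regularization parameter, and the "$\approx$" reflects that the subproblem only needs to be solved to a prescribed accuracy before the clients' solutions are aggregated on the server.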
Cite
Text
Jiang et al. "Stabilized Proximal-Point Methods for Federated Optimization." Neural Information Processing Systems, 2024. doi:10.52202/079017-3164
Markdown
[Jiang et al. "Stabilized Proximal-Point Methods for Federated Optimization." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/jiang2024neurips-stabilized/) doi:10.52202/079017-3164
BibTeX
@inproceedings{jiang2024neurips-stabilized,
title = {{Stabilized Proximal-Point Methods for Federated Optimization}},
author = {Jiang, Xiaowen and Rodomanov, Anton and Stich, Sebastian U.},
booktitle = {Neural Information Processing Systems},
year = {2024},
doi = {10.52202/079017-3164},
url = {https://mlanthology.org/neurips/2024/jiang2024neurips-stabilized/}
}