ProxSkip for Stochastic Variational Inequalities: A Federated Learning Algorithm for Provable Communication Acceleration
Abstract
Recently, Mishchenko et al. (2022) proposed and analyzed ProxSkip, a provably efficient method for minimizing the sum of a smooth $(f)$ and an expensive nonsmooth proximable $(R)$ function (i.e., $\min_{x \in \mathbb{R}^d} f(x) + R(x)$). The main advantage of ProxSkip is that, in the federated learning (FL) setting, it provably offers an effective acceleration of communication complexity. This work extends this approach to the more general regularized variational inequality problem (VIP). In particular, we propose the ProxSkip-VIP algorithm, which generalizes the original ProxSkip framework of Mishchenko et al. (2022) to VIPs, and we provide convergence guarantees for a class of structured non-monotone problems. In the federated learning setting, we explain how our approach achieves acceleration in terms of communication complexity over existing state-of-the-art FL algorithms.
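To make the skipping idea concrete, below is a minimal Python sketch of a ProxSkip-style update applied to an operator $F$ in place of a gradient, following the general structure of the original ProxSkip method of Mishchenko et al. (2022). It is illustrative only, not the paper's exact ProxSkip-VIP algorithm; the names `F`, `prox_R`, `gamma`, and `p` are assumptions for this example. The prox step, which is the expensive (communication) step in FL, is only executed with probability `p`, while a control variate `h` corrects the cheap operator steps in between.

```python
import numpy as np

def proxskip_vip_sketch(F, prox_R, x0, gamma, p, n_iters, rng=None):
    """Illustrative ProxSkip-style loop for a regularized VIP (not the paper's exact method).

    F      : operator replacing the gradient (F = grad f recovers the minimization case).
    prox_R : prox_R(v, step) = argmin_x R(x) + ||x - v||^2 / (2 * step).
    gamma  : step size; p : probability of performing the expensive prox step.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0.copy()
    h = np.zeros_like(x0)                      # control variate
    for _ in range(n_iters):
        x_hat = x - gamma * (F(x) - h)         # operator step; prox skipped by default
        if rng.random() < p:                   # with probability p, take the prox step
            x_new = prox_R(x_hat - (gamma / p) * h, gamma / p)
        else:
            x_new = x_hat
        h = h + (p / gamma) * (x_new - x_hat)  # update control variate
        x = x_new
    return x
```

In the FL reading of this sketch, the prox step corresponds to communication with the server, so choosing a small `p` is what yields the reduced communication complexity discussed in the abstract.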
Cite
Text
Zhang and Loizou. "ProxSkip for Stochastic Variational Inequalities: A Federated Learning Algorithm for Provable Communication Acceleration." NeurIPS 2022 Workshops: OPT, 2022.
Markdown
[Zhang and Loizou. "ProxSkip for Stochastic Variational Inequalities: A Federated Learning Algorithm for Provable Communication Acceleration." NeurIPS 2022 Workshops: OPT, 2022.](https://mlanthology.org/neuripsw/2022/zhang2022neuripsw-proxskip/)
BibTeX
@inproceedings{zhang2022neuripsw-proxskip,
title = {{ProxSkip for Stochastic Variational Inequalities: A Federated Learning Algorithm for Provable Communication Acceleration}},
author = {Zhang, Siqi and Loizou, Nicolas},
booktitle = {NeurIPS 2022 Workshops: OPT},
year = {2022},
url = {https://mlanthology.org/neuripsw/2022/zhang2022neuripsw-proxskip/}
}