Distributed Saddle-Point Problems Under Data Similarity
Abstract
We study solution methods for (strongly-)convex-(strongly-)concave Saddle-Point Problems (SPPs) over networks of two types: master/workers (thus centralized) architectures and mesh (thus decentralized) networks. The local functions at each node are assumed to be \textit{similar}, due to statistical data similarity or otherwise. We establish lower complexity bounds for a fairly general class of algorithms solving the SPP. We show that a given suboptimality $\varepsilon>0$ is achieved over master/workers networks in $\Omega\big(\Delta\cdot \delta/\mu\cdot \log (1/\varepsilon)\big)$ rounds of communication, where $\delta>0$ measures the degree of similarity of the local functions, $\mu$ is their strong convexity constant, and $\Delta$ is the diameter of the network. The lower communication complexity bound over mesh networks reads $\Omega\big(1/\sqrt{\rho} \cdot \delta/\mu\cdot\log (1/\varepsilon)\big)$, where $\rho$ is the (normalized) eigengap of the gossip matrix used for communication between neighbouring nodes. We then propose algorithms matching the lower bounds over either type of network (up to log-factors). We assess the effectiveness of the proposed algorithms on a robust regression problem.
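To make the role of $\delta$ concrete, below is a minimal NumPy sketch (not the paper's algorithm): it simulates nodes holding statistically similar quadratic SPPs $f_i(x,y)=x^\top A_i y+\tfrac{\mu}{2}\|x\|^2-\tfrac{\mu}{2}\|y\|^2$, measures similarity as the spectral deviation of the local matrices $A_i$ from their average, and runs the classical extragradient method on the average problem as a baseline. All variable names and the perturbation scale are illustrative assumptions; the point of the printout is that the measured $\delta$ is typically much smaller than the global smoothness $L$, which is why a communication complexity scaling with $\delta/\mu$ rather than $L/\mu$ is a gain.

import numpy as np

rng = np.random.default_rng(0)
d, n_nodes, mu = 20, 8, 0.5

# Local SPPs f_i(x, y) = x^T A_i y + (mu/2)||x||^2 - (mu/2)||y||^2,
# with A_i = A + a small node-specific perturbation (statistically similar data).
A = rng.standard_normal((d, d)) / np.sqrt(d)
A_locals = [A + 0.05 * rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(n_nodes)]
A_bar = sum(A_locals) / n_nodes

# Empirical similarity: spectral deviation of the local mixed Hessians from
# their average (for these quadratics the mixed Hessian of f_i is just A_i).
delta = max(np.linalg.norm(Ai - A_bar, 2) for Ai in A_locals)
L = np.linalg.norm(A_bar, 2) + mu  # global smoothness of the average problem
print(f"similarity delta = {delta:.3f}  vs. global smoothness L = {L:.3f}")

def grad(x, y, M):
    """Partial gradients of x^T M y + (mu/2)||x||^2 - (mu/2)||y||^2."""
    return M @ y + mu * x, M.T @ x - mu * y

# Classical extragradient on the average problem (unique saddle point at 0):
# one extrapolation step, then an update evaluated at the extrapolated point.
x, y = np.ones(d), np.ones(d)
step = 1.0 / (2 * L)
for _ in range(2000):
    gx, gy = grad(x, y, A_bar)
    x_half, y_half = x - step * gx, y + step * gy
    gx, gy = grad(x_half, y_half, A_bar)
    x, y = x - step * gx, y + step * gy
print("distance to the saddle point (0, 0):", np.linalg.norm(np.concatenate([x, y])))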
Cite
Text
Beznosikov et al. "Distributed Saddle-Point Problems Under Data Similarity." Neural Information Processing Systems, 2021.
Markdown
[Beznosikov et al. "Distributed Saddle-Point Problems Under Data Similarity." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/beznosikov2021neurips-distributed/)
BibTeX
@inproceedings{beznosikov2021neurips-distributed,
title = {{Distributed Saddle-Point Problems Under Data Similarity}},
author = {Beznosikov, Aleksandr and Scutari, Gesualdo and Rogozin, Alexander and Gasnikov, Alexander},
booktitle = {Neural Information Processing Systems},
year = {2021},
url = {https://mlanthology.org/neurips/2021/beznosikov2021neurips-distributed/}
}