A First-Order Primal-Dual Method with Adaptivity to Local Smoothness

Abstract

We consider the problem of finding a saddle point for the convex-concave objective $\min_x \max_y f(x) + \langle Ax, y\rangle - g^*(y)$, where $f$ is a convex function with locally Lipschitz gradient and $g$ is convex and possibly non-smooth. We propose an adaptive version of the Condat-Vũ algorithm, which alternates between primal gradient steps and dual proximal steps. The method achieves stepsize adaptivity through a simple rule involving $\|A\|$ and the norm of recently computed gradients of $f$. Under standard assumptions, we prove an $\mathcal{O}(k^{-1})$ ergodic convergence rate. Furthermore, when $f$ is also locally strongly convex and $A$ has full row rank, we show that our method converges at a linear rate. Numerical experiments are provided to illustrate the practical performance of the algorithm.
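For concreteness, below is a minimal NumPy sketch of the kind of iteration the abstract describes: a Condat-Vũ-style scheme alternating a primal gradient step and a dual proximal step, with the primal stepsize re-estimated each iteration from the norm of recently computed gradients of $f$ and from $\|A\|$. The specific stepsize rule, the names `adaptive_condat_vu` and `prox_gstar`, and the toy problem at the bottom are illustrative assumptions, not the paper's exact algorithm or guarantees.

```python
import numpy as np

def adaptive_condat_vu(grad_f, prox_gstar, A, x0, y0, n_iters=500,
                       sigma=1.0, tau0=1e-2, growth=1.2):
    """Illustrative sketch (not the paper's exact rule) of a Condat-Vu-style
    primal-dual iteration for  min_x max_y f(x) + <Ax, y> - g*(y),
    whose primal stepsize adapts to a local Lipschitz estimate of grad f."""
    norm_A = np.linalg.norm(A, 2)              # spectral norm ||A||
    x, y = x0.astype(float), y0.astype(float)
    x_prev, g_prev = x.copy(), grad_f(x)
    tau = tau0
    for _ in range(n_iters):
        g = grad_f(x)
        # Local smoothness estimate from the most recent pair of gradients.
        dx = np.linalg.norm(x - x_prev)
        L_loc = np.linalg.norm(g - g_prev) / dx if dx > 0 else 0.0
        # Classical Condat-Vu condition tau * (L/2 + sigma*||A||^2) <= 1,
        # with L replaced by the local estimate; growth of tau is capped.
        tau = min(growth * tau, 1.0 / (0.5 * L_loc + sigma * norm_A**2 + 1e-12))
        x_prev, g_prev = x.copy(), g
        # Primal gradient step on the smooth part f(x) + <Ax, y>.
        x = x - tau * (g + A.T @ y)
        # Dual proximal step on g*, using the extrapolated primal point.
        y = prox_gstar(y + sigma * (A @ (2 * x - x_prev)), sigma)
    return x, y


# Toy usage: f(x) = 0.5 * ||x - b||^2 (smooth), g = ||.||_1, so g* is the
# indicator of the l_inf unit ball and prox_{sigma g*} is a clip to [-1, 1].
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
b = rng.standard_normal(50)
x_hat, y_hat = adaptive_condat_vu(
    grad_f=lambda x: x - b,
    prox_gstar=lambda v, s: np.clip(v, -1.0, 1.0),
    A=A,
    x0=np.zeros(50),
    y0=np.zeros(20),
)
```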

Cite

Text

Vladarean et al. "A First-Order Primal-Dual Method with Adaptivity to Local Smoothness." Neural Information Processing Systems, 2021.

Markdown

[Vladarean et al. "A First-Order Primal-Dual Method with Adaptivity to Local Smoothness." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/vladarean2021neurips-firstorder/)

BibTeX

@inproceedings{vladarean2021neurips-firstorder,
  title     = {{A First-Order Primal-Dual Method with Adaptivity to Local Smoothness}},
  author    = {Vladarean, Maria-Luiza and Malitsky, Yura and Cevher, Volkan},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/vladarean2021neurips-firstorder/}
}