Non-Ergodic Alternating Proximal Augmented Lagrangian Algorithms with Optimal Rates
Abstract
We develop two new non-ergodic alternating proximal augmented Lagrangian algorithms (NEAPAL) to solve a class of nonsmooth constrained convex optimization problems. Our approach relies on a novel combination of the augmented Lagrangian framework, an alternating/linearization scheme, Nesterov's acceleration techniques, and an adaptive parameter strategy. Our algorithms have several new features compared to existing methods. Firstly, they perform a Nesterov acceleration step on the primal variables, in contrast to the dual acceleration used by several methods in the literature. Secondly, they achieve non-ergodic optimal convergence rates under standard assumptions: an $\mathcal{O}\left(\frac{1}{k}\right)$ rate without any smoothness or strong-convexity-type assumption, or an $\mathcal{O}\left(\frac{1}{k^2}\right)$ rate under only semi-strong convexity, where $k$ is the iteration counter. Thirdly, their per-iteration complexity matches or improves on that of existing algorithms. Fourthly, they can be implemented in a parallel fashion. Finally, all parameters are updated adaptively, without heuristic tuning. We verify our algorithms on different numerical examples and compare them with several state-of-the-art methods.
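To make the scheme concrete, here is a minimal numpy sketch of the general idea the abstract describes: a linearized proximal step on the augmented Lagrangian with Nesterov-style momentum on the *primal* iterate, followed by a dual ascent step, applied to the toy problem $\min_x \|x\|_1$ subject to $Ax = b$. The function names, the FISTA momentum rule, and the fixed penalty $\rho$ are illustrative assumptions on my part; the paper's actual NEAPAL updates and adaptive parameter schedule differ.

```python
import numpy as np

def soft_threshold(v, lam):
    """Proximal operator of lam * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def accelerated_linearized_al(A, b, rho=1.0, iters=500):
    """Illustrative linearized augmented Lagrangian method with Nesterov
    momentum on the primal iterate, for min ||x||_1 s.t. Ax = b.
    A generic sketch in the spirit of NEAPAL, NOT the paper's exact
    update rules or adaptive parameter schedule."""
    m, n = A.shape
    # Lipschitz constant of the gradient of the smooth augmented-Lagrangian part
    L = rho * np.linalg.norm(A, 2) ** 2
    x = x_hat = np.zeros(n)
    y = np.zeros(m)   # dual multiplier
    t = 1.0
    for _ in range(iters):
        # Linearized proximal (forward-backward) step on the augmented Lagrangian
        grad = A.T @ (y + rho * (A @ x_hat - b))
        x_new = soft_threshold(x_hat - grad / L, 1.0 / L)
        # Nesterov-style momentum on the PRIMAL variable (FISTA rule, assumed here)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t ** 2))
        x_hat = x_new + ((t - 1.0) / t_new) * (x_new - x)
        # Dual ascent step on the multiplier
        y = y + rho * (A @ x_new - b)
        x, t = x_new, t_new
    return x

# Tiny usage example on a random underdetermined system with a sparse solution
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true
x = accelerated_linearized_al(A, b)
print("feasibility ||Ax - b||:", np.linalg.norm(A @ x - b))
```

Note that the momentum is applied to the primal iterate $\hat{x}$ rather than to the multiplier $y$, mirroring the first feature the abstract highlights.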
Cite
Text
Dinh. "Non-Ergodic Alternating Proximal Augmented Lagrangian Algorithms with Optimal Rates." Neural Information Processing Systems, 2018.Markdown
[Dinh. "Non-Ergodic Alternating Proximal Augmented Lagrangian Algorithms with Optimal Rates." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/dinh2018neurips-nonergodic/)BibTeX
@inproceedings{dinh2018neurips-nonergodic,
title = {{Non-Ergodic Alternating Proximal Augmented Lagrangian Algorithms with Optimal Rates}},
author = {Tran-Dinh, Quoc},
booktitle = {Neural Information Processing Systems},
year = {2018},
pages = {4811--4819},
url = {https://mlanthology.org/neurips/2018/dinh2018neurips-nonergodic/}
}