Inexact Augmented Lagrangian Methods for Conic Optimization: Quadratic Growth and Linear Convergence

Abstract

Augmented Lagrangian Methods (ALMs) are widely employed for solving constrained optimization problems, and several efficient solvers have been developed based on this framework. Under a quadratic growth assumption, the dual iterates and the Karush–Kuhn–Tucker (KKT) residuals of ALMs applied to conic programs are known to converge linearly. In contrast, the convergence rate of the primal iterates has remained elusive. In this paper, we resolve this challenge by establishing new *quadratic growth* and *error bound* properties for primal and dual conic programs under the standard strict complementarity condition. Our main results reveal that both the primal and dual iterates of ALMs converge linearly under only the assumptions of strict complementarity and a bounded solution set. This finding provides a positive answer to an open question regarding the asymptotic linear convergence of the primal iterates of ALMs applied to conic optimization.
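To make the ALM framework referenced above concrete, here is a minimal sketch of a classical (exact-subproblem) augmented Lagrangian iteration on a toy equality-constrained quadratic program. It is illustrative only: the problem class, step sizes, and exact primal solve are assumptions for this example, not the inexact conic ALM analyzed in the paper.

```python
import numpy as np

def alm(Q, c, A, b, rho=10.0, tol=1e-8, max_iter=100):
    """Basic ALM for:  min 0.5 x'Qx + c'x  s.t.  Ax = b  (toy example).

    Each outer iteration minimizes the augmented Lagrangian
        L_rho(x, y) = 0.5 x'Qx + c'x + y'(Ax - b) + (rho/2)||Ax - b||^2
    exactly in x (a single linear solve, since the subproblem is an
    unconstrained QP), then applies the standard multiplier update
        y <- y + rho * (Ax - b).
    """
    x = np.zeros(Q.shape[0])
    y = np.zeros(A.shape[0])
    for _ in range(max_iter):
        # Primal step: grad_x L = Qx + c + A'y + rho*A'(Ax - b) = 0
        x = np.linalg.solve(Q + rho * A.T @ A, -c - A.T @ y + rho * A.T @ b)
        r = A @ x - b          # primal (constraint) residual
        y = y + rho * r        # dual ascent on the multipliers
        if np.linalg.norm(r) < tol:
            break
    return x, y
```

For instance, minimizing `0.5*||x||^2 - x[0] - x[1]` subject to `x[0] + x[1] = 1` gives the primal solution `x = [0.5, 0.5]` with multiplier `y = [0.5]`; with an exact primal solve, the constraint residual here contracts geometrically across outer iterations, mirroring the kind of linear convergence the paper studies.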

Cite

Text

Liao et al. "Inexact Augmented Lagrangian Methods for Conic Optimization: Quadratic Growth and Linear Convergence." Neural Information Processing Systems, 2024. doi:10.52202/079017-1297

Markdown

[Liao et al. "Inexact Augmented Lagrangian Methods for Conic Optimization: Quadratic Growth and Linear Convergence." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/liao2024neurips-inexact/) doi:10.52202/079017-1297

BibTeX

@inproceedings{liao2024neurips-inexact,
  title     = {{Inexact Augmented Lagrangian Methods for Conic Optimization: Quadratic Growth and Linear Convergence}},
  author    = {Liao, Feng-Yi and Ding, Lijun and Zheng, Yang},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-1297},
  url       = {https://mlanthology.org/neurips/2024/liao2024neurips-inexact/}
}