Adaptive Relaxed ADMM: Convergence Theory and Practical Implementation

Abstract

Many modern computer vision and machine learning applications rely on solving difficult optimization problems that involve non-differentiable objective functions and constraints. The alternating direction method of multipliers (ADMM) is a widely used approach to solve such problems. Relaxed ADMM is a generalization of ADMM that often achieves better performance, but its efficiency depends strongly on algorithm parameters that must be chosen by an expert user. We propose an adaptive method that automatically tunes the key algorithm parameters to achieve optimal performance without user oversight. Inspired by recent work on adaptivity, the proposed adaptive relaxed ADMM (ARADMM) is derived by assuming a Barzilai-Borwein style linear gradient. A detailed convergence analysis of ARADMM is provided, and numerical results on several applications demonstrate fast practical convergence.
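To make the setting concrete, the sketch below shows relaxed ADMM on a standard test problem, the lasso, written as min 0.5||Ax-b||² + λ||z||₁ subject to x - z = 0. This is a minimal illustration with a *fixed* penalty `rho` and relaxation parameter `alpha` (the two parameters the paper's ARADMM tunes adaptively); the adaptive update rules themselves are not reproduced here, and all names and default values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding: proximal operator of t*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def relaxed_admm_lasso(A, b, lam, rho=1.0, alpha=1.5, iters=200):
    """Relaxed ADMM for the lasso: min 0.5||Ax-b||^2 + lam*||x||_1.

    alpha is the relaxation parameter (alpha = 1 recovers plain ADMM).
    rho and alpha are held fixed here; choosing them automatically is
    exactly the problem ARADMM addresses.
    """
    n = A.shape[1]
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    AtA = A.T @ A + rho * np.eye(n)  # system matrix for the x-update
    Atb = A.T @ b
    for _ in range(iters):
        # x-update: ridge-type least-squares subproblem
        x = np.linalg.solve(AtA, Atb + rho * (z - u))
        # over-relaxation: blend the new x with the previous z
        x_hat = alpha * x + (1.0 - alpha) * z
        # z-update: soft-thresholding, then dual ascent
        z = soft_threshold(x_hat + u, lam / rho)
        u = u + x_hat - z
    return z
```

With alpha in (1, 2), the over-relaxation step often speeds convergence over plain ADMM, but the best alpha (and rho) is problem-dependent, which is the practical motivation for the adaptive scheme studied in the paper.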

Cite

Text

Xu et al. "Adaptive Relaxed ADMM: Convergence Theory and Practical Implementation." Conference on Computer Vision and Pattern Recognition, 2017. doi:10.1109/CVPR.2017.765

Markdown

[Xu et al. "Adaptive Relaxed ADMM: Convergence Theory and Practical Implementation." Conference on Computer Vision and Pattern Recognition, 2017.](https://mlanthology.org/cvpr/2017/xu2017cvpr-adaptive/) doi:10.1109/CVPR.2017.765

BibTeX

@inproceedings{xu2017cvpr-adaptive,
  title     = {{Adaptive Relaxed ADMM: Convergence Theory and Practical Implementation}},
  author    = {Xu, Zheng and Figueiredo, Mario A. T. and Yuan, Xiaoming and Studer, Christoph and Goldstein, Tom},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2017},
  doi       = {10.1109/CVPR.2017.765},
  url       = {https://mlanthology.org/cvpr/2017/xu2017cvpr-adaptive/}
}