ADMM and Accelerated ADMM as Continuous Dynamical Systems

Abstract

Recently, there has been an increasing interest in using tools from dynamical systems to analyze the behavior of simple optimization algorithms such as gradient descent and accelerated variants. This paper strengthens such connections by deriving the differential equations that model the continuous limit of the sequence of iterates generated by the alternating direction method of multipliers, as well as an accelerated variant. We employ the direct method of Lyapunov to analyze the stability of critical points of the dynamical systems and to obtain associated convergence rates.
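The paper itself is theoretical and contains no code. For readers unfamiliar with the discrete iteration whose continuous limit the paper studies, here is a minimal, hypothetical sketch of standard (unaccelerated) ADMM applied to a scalar lasso problem; the problem, function names, and parameter values are illustrative assumptions, not taken from the paper:

```python
# Hypothetical illustration (not from the paper): ADMM for the scalar
# lasso problem  minimize (1/2)(x - a)^2 + lam*|z|  subject to  x = z.
# The values of a, lam, and rho below are arbitrary choices.

def soft_threshold(v, t):
    """Proximal operator of t*|.| (soft-thresholding)."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def admm_scalar_lasso(a, lam, rho=1.0, iters=200):
    x = z = u = 0.0  # u is the scaled dual variable
    for _ in range(iters):
        # x-update: minimize (1/2)(x - a)^2 + (rho/2)(x - z + u)^2
        x = (a + rho * (z - u)) / (1.0 + rho)
        # z-update: proximal step on lam*|z|
        z = soft_threshold(x + u, lam / rho)
        # dual update: ascent step on the scaled multiplier
        u += x - z
    return x, z

x, z = admm_scalar_lasso(a=2.0, lam=0.5)
# The minimizer is the soft-threshold of a: max(a - lam, 0) = 1.5 here.
```

The paper analyzes the differential equation obtained when iterates of such a scheme are viewed as samples of a continuous trajectory with vanishing step size; the sketch above shows only the discrete iteration, not that limiting argument.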

Cite

Text

Franca et al. "ADMM and Accelerated ADMM as Continuous Dynamical Systems." International Conference on Machine Learning, 2018.

Markdown

[Franca et al. "ADMM and Accelerated ADMM as Continuous Dynamical Systems." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/franca2018icml-admm/)

BibTeX

@inproceedings{franca2018icml-admm,
  title     = {{ADMM and Accelerated ADMM as Continuous Dynamical Systems}},
  author    = {Franca, Guilherme and Robinson, Daniel and Vidal, Rene},
  booktitle = {International Conference on Machine Learning},
  year      = {2018},
  pages     = {1559--1567},
  volume    = {80},
  url       = {https://mlanthology.org/icml/2018/franca2018icml-admm/}
}