Fast and Provable ADMM for Learning with Generative Priors

Abstract

In this work, we propose a (linearized) Alternating Direction Method of Multipliers (ADMM) algorithm for minimizing a convex function subject to a nonconvex constraint. We focus on the special case where such a constraint arises from the specification that a variable should lie in the range of a neural network. This is motivated by recent successful applications of Generative Adversarial Networks (GANs) in tasks like compressive sensing, denoising, and robustness against adversarial examples. The derived rates for our algorithm are characterized in terms of certain geometric properties of the generator network, which we show hold for feedforward architectures under mild assumptions. Unlike gradient descent (GD), our algorithm can efficiently handle non-smooth objectives and exploit partial minimization procedures, making it faster in many practical scenarios.
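To make the problem setup concrete, the following is a minimal sketch (not the authors' exact algorithm or step-size schedule) of a linearized ADMM splitting for min_x f(x) subject to x = G(z), where f is a convex data-fidelity term and G is a fixed generator network. The toy generator, the least-squares objective, and all parameter values below are illustrative assumptions.

import torch

torch.manual_seed(0)

# Toy feedforward generator G: R^k -> R^n (assumption: any fixed differentiable network).
k, n = 5, 20
G = torch.nn.Sequential(torch.nn.Linear(k, 32), torch.nn.ReLU(), torch.nn.Linear(32, n))
for p in G.parameters():
    p.requires_grad_(False)  # the generator is fixed; we only optimize x, z, and the dual.

# Convex objective f(x) = 0.5 * ||A x - b||^2 (e.g., a compressive-sensing loss).
m = 10
A, b = torch.randn(m, n), torch.randn(m)
def f_grad(x):
    return A.t() @ (A @ x - b)

rho, eta_x, eta_z = 1.0, 0.05, 0.05   # penalty parameter and step sizes (illustrative)
x = torch.zeros(n)
z = torch.zeros(k, requires_grad=True)
lam = torch.zeros(n)                  # dual variable for the constraint x = G(z)

for it in range(500):
    # x-update: linearized (gradient) step on f(x) + (rho/2)||x - G(z) + lam/rho||^2
    with torch.no_grad():
        Gz = G(z)
        x = x - eta_x * (f_grad(x) + rho * (x - Gz + lam / rho))

    # z-update: gradient step on the nonconvex penalty (rho/2)||x - G(z) + lam/rho||^2
    if z.grad is not None:
        z.grad.zero_()
    penalty = 0.5 * rho * ((x - G(z) + lam / rho) ** 2).sum()
    penalty.backward()
    with torch.no_grad():
        z -= eta_z * z.grad

    # dual ascent on the constraint x = G(z)
    with torch.no_grad():
        lam = lam + rho * (x - G(z))

The split keeps the convex part (the x-update) separate from the nonconvex generator constraint (the z-update), which is where ADMM can exploit partial minimization or proximal steps on f when they are cheap; in the paper this is the advantage over plain gradient descent on the composed objective f(G(z)).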

Cite

Text

Latorre et al. "Fast and Provable ADMM for Learning with Generative Priors." Neural Information Processing Systems, 2019.

Markdown

[Latorre et al. "Fast and Provable ADMM for Learning with Generative Priors." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/latorre2019neurips-fast/)

BibTeX

@inproceedings{latorre2019neurips-fast,
  title     = {{Fast and Provable ADMM for Learning with Generative Priors}},
  author    = {Latorre, Fabian and Eftekhari, Armin and Cevher, Volkan},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {12027--12039},
  url       = {https://mlanthology.org/neurips/2019/latorre2019neurips-fast/}
}