An Efficient Nonconvex Reformulation of Stagewise Convex Optimization Problems

Abstract

Convex optimization problems with staged structure appear in several contexts, including optimal control, verification of deep neural networks, and isotonic regression. Off-the-shelf solvers can solve these problems but may scale poorly. We develop a nonconvex reformulation designed to exploit this staged structure. Our reformulation has only simple bound constraints, enabling solution via projected gradient methods and their accelerated variants. The method automatically generates a sequence of primal and dual feasible solutions to the original convex problem, making optimality certification easy. We establish theoretical properties of the nonconvex formulation, showing that it is (almost) free of spurious local minima and has the same global optimum as the convex problem. We modify projected gradient descent to avoid spurious local minimizers so that it always converges to the global minimizer. For neural network verification, our approach obtains small duality gaps in only a few gradient steps. Consequently, it can provide tight bounds for many large-scale verification problems where both off-the-shelf and specialized solvers struggle.
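
To make the mechanics concrete: because the reformulation has only simple bound (box) constraints, the projection step in projected gradient descent reduces to elementwise clipping. The sketch below is a generic projected gradient loop in NumPy, not the paper's algorithm; the objective gradient grad_f, the bounds lb and ub, and the toy quadratic in the usage example are placeholder assumptions for illustration.

import numpy as np

def projected_gradient_descent(grad_f, x0, lb, ub, step_size=0.05, num_steps=500):
    # Minimize a smooth objective over the box [lb, ub].
    # grad_f: callable returning the gradient at x (placeholder for the
    #         gradient of the nonconvex reformulation's objective).
    # lb, ub: elementwise bounds -- the "simple bound constraints" from
    #         the abstract, whose projection is just np.clip.
    x = np.clip(x0, lb, ub)            # start from a feasible point
    for _ in range(num_steps):
        x = x - step_size * grad_f(x)  # gradient step
        x = np.clip(x, lb, ub)         # projection onto the box
    return x

# Toy usage (illustrative only): minimize ||A x - b||^2 over [0, 1]^3.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)
grad = lambda x: 2.0 * A.T @ (A @ x - b)
x_star = projected_gradient_descent(grad, x0=np.zeros(3), lb=0.0, ub=1.0,
                                    step_size=0.02, num_steps=500)
print(x_star)

The cheapness of the projection is also what makes the accelerated variants mentioned in the abstract (e.g., Nesterov-style momentum followed by the same clipping step) straightforward to apply.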

Cite

Text

Bunel et al. "An Efficient Nonconvex Reformulation of Stagewise Convex Optimization Problems." Neural Information Processing Systems, 2020.

Markdown

[Bunel et al. "An Efficient Nonconvex Reformulation of Stagewise Convex Optimization Problems." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/bunel2020neurips-efficient/)

BibTeX

@inproceedings{bunel2020neurips-efficient,
  title     = {{An Efficient Nonconvex Reformulation of Stagewise Convex Optimization Problems}},
  author    = {Bunel, Rudy R and Hinder, Oliver and Bhojanapalli, Srinadh and Dvijotham, Krishnamurthy},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/bunel2020neurips-efficient/}
}