Conic Descent and Its Application to Memory-Efficient Optimization over Positive Semidefinite Matrices

Abstract

We present an extension of the conditional gradient method to problems whose feasible sets are convex cones. We provide a convergence analysis for the method and for variants with nonconvex objectives, and we extend the analysis to practical cases with effective line search strategies. For the specific case of the positive semidefinite cone, we present a memory-efficient version based on randomized matrix sketches and advocate a heuristic greedy step that greatly improves its practical performance. Numerical results on phase retrieval and matrix completion problems indicate that our method can offer substantial advantages over traditional conditional gradient and Burer-Monteiro approaches.
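The conic extension of conditional gradient described in the abstract can be illustrated with a minimal greedy rank-one descent loop over the PSD cone. This is our own simplified sketch, not the authors' algorithm: it repeatedly adds the rank-one atom `v vᵀ` along the most negative eigendirection of the gradient, using an exact line search that is valid for the quadratic objective `f(X) = ||X - M||_F²` chosen here for illustration.

```python
import numpy as np

def conic_greedy(grad, n, iters=100, tol=1e-10):
    """Greedy rank-one descent over the PSD cone (illustrative sketch).

    grad: callable X -> gradient of a smooth convex f at X.
    Each iteration finds the eigenvector of the gradient with the most
    negative eigenvalue; v v^T is then the rank-one PSD atom that best
    aligns with the negative gradient. The step size below is the exact
    line search for f(X) = ||X - M||_F^2, whose gradient is 2(X - M).
    """
    X = np.zeros((n, n))
    for _ in range(iters):
        G = grad(X)
        w, V = np.linalg.eigh((G + G.T) / 2.0)  # symmetrize for safety
        lam, v = w[0], V[:, 0]
        if lam >= -tol:
            break  # gradient is PSD: no descent direction left in the cone
        t = -lam / 2.0  # exact line search: t* = -<G, vv^T> / (2 ||vv^T||_F^2)
        X = X + t * np.outer(v, v)
    return X

# Usage: for f(X) = ||X - M||_F^2, the minimizer over the PSD cone is the
# projection of M onto the cone (keep only nonnegative eigenvalues of M).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
M = (A + A.T) / 2.0
X = conic_greedy(lambda X: 2.0 * (X - M), 5)
```

For this quadratic the loop recovers the PSD projection of `M` exactly after one step per positive eigenvalue; the paper's method adds the memory-efficient sketching and line-search machinery that this toy loop omits.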

Cite

Text

Duchi et al. "Conic Descent and Its Application to Memory-Efficient Optimization over Positive Semidefinite Matrices." Neural Information Processing Systems, 2020.

Markdown

[Duchi et al. "Conic Descent and Its Application to Memory-Efficient Optimization over Positive Semidefinite Matrices." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/duchi2020neurips-conic/)

BibTeX

@inproceedings{duchi2020neurips-conic,
  title     = {{Conic Descent and Its Application to Memory-Efficient Optimization over Positive Semidefinite Matrices}},
  author    = {Duchi, John C. and Hinder, Oliver and Naber, Andrew and Ye, Yinyu},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/duchi2020neurips-conic/}
}