Are We There Yet? Manifold Identification of Gradient-Related Proximal Methods

Abstract

In machine learning, models that generalize better often generate outputs that lie on a low-dimensional manifold. Recently, several works have separately shown finite-time manifold identification by some proximal methods. In this work, we provide a unified view by giving a simple condition under which any proximal method using a constant step size can achieve finite-iteration manifold detection. For several key methods (FISTA, DRS, ADMM, SVRG, SAGA, and RDA), we give an iteration bound, characterized in terms of their variable convergence rate and a problem-dependent constant that indicates problem degeneracy. For popular models, this constant is related to certain data assumptions, which gives intuition as to when lower active set complexity may be expected in practice.
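As a concrete illustration (not from the paper), the sketch below runs the proximal gradient method with a constant step size on a lasso problem and prints each iteration at which the iterate's support changes. The support is the simplest example of the manifold being identified: after finitely many iterations it stabilizes at the solution's support. The problem sizes, data, and regularization strength are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: proximal gradient descent on a lasso problem,
#   min_x 0.5 * ||Ax - b||^2 + lam * ||x||_1,
# with a constant step size 1/L. "Manifold identification" here means the
# iterates' supports stop changing after finitely many iterations.

rng = np.random.default_rng(0)
n, d = 50, 20
A = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[:3] = [2.0, -1.5, 1.0]            # sparse ground truth (assumed setup)
b = A @ x_true + 0.01 * rng.standard_normal(n)

lam = 0.5
step = 1.0 / np.linalg.norm(A, 2) ** 2   # constant step 1/L, L = ||A||_2^2

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

x = np.zeros(d)
prev_support = None
for k in range(500):
    grad = A.T @ (A @ x - b)
    x = soft_threshold(x - step * grad, step * lam)
    support = tuple(np.nonzero(x)[0])
    if support != prev_support:          # report each active-set change
        print(f"iter {k:3d}: support changed to {support}")
        prev_support = support
```

Running this, the printed support changes cease after a finite number of iterations even though the iterates themselves keep converging, which is the behavior the paper's iteration bounds quantify.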

Cite

Text

Sun et al. "Are We There Yet? Manifold Identification of Gradient-Related Proximal Methods." Artificial Intelligence and Statistics, 2019.

Markdown

[Sun et al. "Are We There Yet? Manifold Identification of Gradient-Related Proximal Methods." Artificial Intelligence and Statistics, 2019.](https://mlanthology.org/aistats/2019/sun2019aistats-we/)

BibTeX

@inproceedings{sun2019aistats-we,
  title     = {{Are We There Yet? Manifold Identification of Gradient-Related Proximal Methods}},
  author    = {Sun, Yifan and Jeong, Halyun and Nutini, Julie and Schmidt, Mark},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2019},
  pages     = {1110--1119},
  volume    = {89},
  url       = {https://mlanthology.org/aistats/2019/sun2019aistats-we/}
}