Fast Convergence of Greedy 2-Coordinate Updates for Optimizing with an Equality Constraint
Abstract
We consider minimizing a smooth function subject to an equality constraint. We analyze a greedy 2-coordinate update algorithm, and prove that greedy coordinate selection leads to faster convergence than random selection (under the Polyak-Łojasiewicz assumption). Our simple analysis exploits an equivalence between the greedy 2-coordinate update and equality-constrained steepest descent in the L1-norm. Unlike previous 2-coordinate analyses, our convergence rate is dimension independent.
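The abstract refers to a greedy 2-coordinate update for minimizing a smooth function under an equality constraint. Below is a minimal sketch of this type of update, assuming the constraint is a fixed sum (sum(x) = const) and that f is L-smooth; the function names (greedy_two_coordinate, grad_f) and the 1/(2L) step size are illustrative assumptions, not necessarily the paper's exact rule.

```python
import numpy as np

def greedy_two_coordinate(grad_f, x0, L, n_iters=1000):
    """Sketch of a greedy 2-coordinate update for
    min f(x) subject to sum(x) = const, assuming f is L-smooth.

    Each iteration picks the coordinates with the largest and smallest
    partial derivatives and shifts mass between them, which leaves the
    sum (and hence the equality constraint) unchanged.
    """
    x = x0.copy()
    for _ in range(n_iters):
        g = grad_f(x)
        i = int(np.argmax(g))  # coordinate with the largest gradient entry
        j = int(np.argmin(g))  # coordinate with the smallest gradient entry
        # Step size 1/(2L) is a standard choice for an L-smooth objective
        # (an assumption for this sketch).
        delta = (g[i] - g[j]) / (2.0 * L)
        x[i] -= delta  # decrease where the gradient is largest
        x[j] += delta  # increase where the gradient is smallest
    return x

# Example (hypothetical): a separable quadratic with a feasible start, sum(x0) = 1.
A = np.diag([1.0, 2.0, 3.0])
b = np.array([1.0, 0.0, -1.0])
x0 = np.full(3, 1.0 / 3.0)
x_hat = greedy_two_coordinate(lambda x: A @ x - b, x0, L=3.0)
```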
Cite
Text
Ramesh et al. "Fast Convergence of Greedy 2-Coordinate Updates for Optimizing with an Equality Constraint." NeurIPS 2022 Workshops: OPT, 2022.
Markdown
[Ramesh et al. "Fast Convergence of Greedy 2-Coordinate Updates for Optimizing with an Equality Constraint." NeurIPS 2022 Workshops: OPT, 2022.](https://mlanthology.org/neuripsw/2022/ramesh2022neuripsw-fast/)
BibTeX
@inproceedings{ramesh2022neuripsw-fast,
title = {{Fast Convergence of Greedy 2-Coordinate Updates for Optimizing with an Equality Constraint}},
author = {Ramesh, Amrutha Varshini and Mishkin, Aaron and Schmidt, Mark},
booktitle = {NeurIPS 2022 Workshops: OPT},
year = {2022},
url = {https://mlanthology.org/neuripsw/2022/ramesh2022neuripsw-fast/}
}