Optimal First-Order Algorithms as a Function of Inequalities
Abstract
In this work, we present a novel algorithm design methodology that finds the optimal algorithm as a function of inequalities. Specifically, we restrict the convergence analysis of an algorithm to use a prespecified subset of inequalities, rather than all true inequalities, and we find the optimal algorithm subject to this restriction. This methodology allows us to design algorithms with desired characteristics. As concrete demonstrations of this methodology, we find new state-of-the-art accelerated first-order gradient methods that use randomized coordinate updates and backtracking line searches.
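To give a concrete feel for the methodology (this sketch is not the authors' code), analyses of this style can be phrased as performance estimation problems: small SDPs whose constraints are exactly the inequalities the analysis is permitted to use. The sketch below, written with cvxpy, computes the worst-case value of f(x1) - f* after one gradient step on an L-smooth convex function; shrinking the `pairs` list mimics the paper's restriction to a prespecified subset of inequalities. The values of L and h and the choice of `pairs` are illustrative assumptions, not taken from the paper.

```python
import numpy as np
import cvxpy as cp

L, h = 1.0, 1.0   # smoothness constant and step size h = 1/L (illustrative)

# Gram basis: x0 - x*, g0, g1 (with g* = 0 and f* = 0).
G = cp.Variable((3, 3), PSD=True)   # Gram matrix of the basis vectors
F = cp.Variable(2)                  # F[0] = f(x0) - f*, F[1] = f(x1) - f*

# Coordinates (relative to x*) of iterates and gradients in the basis.
x = {"s": np.zeros(3), 0: np.array([1.0, 0.0, 0.0]),
     1: np.array([1.0, -h, 0.0])}     # gradient step: x1 = x0 - h * g0
g = {"s": np.zeros(3), 0: np.array([0.0, 1.0, 0.0]),
     1: np.array([0.0, 0.0, 1.0])}
f = {"s": 0.0, 0: F[0], 1: F[1]}

def interp(i, j):
    # Smooth convex interpolation inequality between points i and j:
    # f_i >= f_j + <g_j, x_i - x_j> + (1 / 2L) ||g_i - g_j||^2.
    dx, dg = x[i] - x[j], g[i] - g[j]
    return f[i] - f[j] - g[j] @ G @ dx - dg @ G @ dg / (2 * L) >= 0

# All valid inequalities among {x*, x0, x1}. The paper's methodology
# replaces this full list with a prespecified subset, e.g.
#   pairs = [("s", 1), (0, 1)]
# and then finds the optimal method under that restriction.
pairs = [(i, j) for i in ("s", 0, 1) for j in ("s", 0, 1) if i != j]

constraints = [interp(i, j) for i, j in pairs]
constraints.append(G[0, 0] <= 1)    # initial condition ||x0 - x*||^2 <= 1

prob = cp.Problem(cp.Maximize(F[1]), constraints)
prob.solve()
print(f"worst-case f(x1) - f*: {prob.value:.4f}")  # ~0.1667 = L/6 (full set)
```

Dropping inequalities from `pairs` enlarges the feasible set, so the computed worst-case bound can only get weaker; the paper's contribution, roughly, is to then re-optimize the algorithm's coefficients so that the restricted analysis is as strong as possible.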
Cite
Text
Park and Ryu. "Optimal First-Order Algorithms as a Function of Inequalities." Journal of Machine Learning Research, 2024.
Markdown
[Park and Ryu. "Optimal First-Order Algorithms as a Function of Inequalities." Journal of Machine Learning Research, 2024.](https://mlanthology.org/jmlr/2024/park2024jmlr-optimal/)
BibTeX
@article{park2024jmlr-optimal,
title = {{Optimal First-Order Algorithms as a Function of Inequalities}},
author = {Park, Chanwoo and Ryu, Ernest K.},
journal = {Journal of Machine Learning Research},
year = {2024},
pages = {1--66},
volume = {25},
url = {https://mlanthology.org/jmlr/2024/park2024jmlr-optimal/}
}