UniXGrad: A Universal, Adaptive Algorithm with Optimal Guarantees for Constrained Optimization
Abstract
We propose a novel adaptive, accelerated algorithm for the stochastic constrained convex optimization setting. Our method, which is inspired by the Mirror-Prox method, \emph{simultaneously} achieves the optimal rates for smooth and non-smooth problems with either deterministic or stochastic first-order oracles. It does so without any prior knowledge of the smoothness or noise properties of the problem. To the best of our knowledge, this is the first adaptive, unified algorithm that achieves the optimal rates in the constrained setting. We demonstrate the practical performance of our framework through extensive numerical experiments.
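To make the high-level description concrete, below is a minimal Python sketch of a projected extra-gradient (Mirror-Prox-style) loop with an AdaGrad-type step size driven by observed gradient differences. This illustrates the flavor of such methods only: the constraint set, the weights `alpha`, the step-size rule, and all function names are illustrative assumptions, not the exact UniXGrad update, which is specified in the paper.

```python
import numpy as np

def project_l2_ball(x, radius=1.0):
    """Euclidean projection onto an l2 ball (a stand-in for a general
    constraint set)."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def adaptive_extragradient(grad, x0, T=1000, D=1.0):
    """Sketch: extra-gradient steps whose step size adapts to the
    accumulated squared differences between the two gradient queries
    of each round. `grad` may be a deterministic or stochastic oracle."""
    y = x0.astype(float)
    sum_sq = 0.0                    # running sum of weighted squared gradient gaps
    x_avg = np.zeros_like(y)
    weight_sum = 0.0
    for t in range(1, T + 1):
        alpha = float(t)            # increasing averaging weights (illustrative)
        eta = 2.0 * D / np.sqrt(1.0 + sum_sq)
        g_lead = grad(y)            # first oracle call: exploratory gradient
        x = project_l2_ball(y - eta * alpha * g_lead)
        g_corr = grad(x)            # second oracle call: correction gradient
        y = project_l2_ball(y - eta * alpha * g_corr)
        sum_sq += alpha**2 * np.linalg.norm(g_corr - g_lead)**2
        x_avg += alpha * x          # weighted averaging of the iterates
        weight_sum += alpha
    return x_avg / weight_sum

# Example: minimize ||x - b||^2 over the unit l2 ball.
b = np.array([2.0, -1.0])
x_star = adaptive_extragradient(lambda x: 2.0 * (x - b), np.zeros(2), T=2000)
```

Note that in the paper the gradient oracle is queried at weighted averages of the iterates rather than at the iterates themselves, which is what yields the accelerated rate in the smooth deterministic case; the sketch above omits that averaging in the oracle queries for brevity.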
Cite
Text
Kavis et al. "UniXGrad: A Universal, Adaptive Algorithm with Optimal Guarantees for Constrained Optimization." Neural Information Processing Systems, 2019.
Markdown
[Kavis et al. "UniXGrad: A Universal, Adaptive Algorithm with Optimal Guarantees for Constrained Optimization." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/kavis2019neurips-unixgrad/)
BibTeX
@inproceedings{kavis2019neurips-unixgrad,
title = {{UniXGrad: A Universal, Adaptive Algorithm with Optimal Guarantees for Constrained Optimization}},
author = {Kavis, Ali and Levy, Kfir Y. and Bach, Francis and Cevher, Volkan},
booktitle = {Neural Information Processing Systems},
year = {2019},
pages = {6260--6269},
url = {https://mlanthology.org/neurips/2019/kavis2019neurips-unixgrad/}
}