Open Problem: The Oracle Complexity of Smooth Convex Optimization in Nonstandard Settings
Abstract
First-order convex minimization algorithms are currently the methods of choice for large-scale sparse, and more generally parsimonious, regression models. We pose the question of the limits of performance of black-box oriented methods for convex minimization in *non-standard* settings, where the regularity of the objective is measured in a norm not necessarily induced by the feasible domain. This question is studied for $\ell_p/\ell_q$-settings and their matrix analogues (Schatten norms), where we find surprising gaps between known lower bounds and state-of-the-art methods. We propose a conjecture on the optimal convergence rates for these settings; a positive answer would lead to significant improvements in minimization algorithms for parsimonious regression models.
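As a point of reference for the class of methods the abstract discusses, below is a minimal sketch (not from the paper) of a first-order black-box method for a parsimonious regression model: the Frank-Wolfe (conditional gradient) algorithm for least-squares over an $\ell_1$-ball. Each iteration queries only a gradient oracle, matching the black-box model whose oracle complexity the open problem concerns. All function names, parameters, and data here are illustrative assumptions.

```python
import numpy as np

def frank_wolfe_l1(A, b, radius=1.0, iters=200):
    """Minimize 0.5*||Ax - b||^2 over the l1-ball of the given radius.

    Each iteration uses only first-order (gradient) information,
    i.e., one call to the black-box oracle per step.
    """
    n = A.shape[1]
    x = np.zeros(n)
    for t in range(iters):
        grad = A.T @ (A @ x - b)           # first-order oracle call
        i = np.argmax(np.abs(grad))        # linear minimization oracle:
        s = np.zeros(n)                    # the vertex of the l1-ball
        s[i] = -radius * np.sign(grad[i])  # minimizing <grad, s>
        gamma = 2.0 / (t + 2.0)            # standard step-size schedule
        x = (1 - gamma) * x + gamma * s    # convex combination stays feasible
    return x

# Illustrative usage on synthetic sparse-regression data (assumed):
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))
x_true = np.zeros(50)
x_true[:3] = [1.0, -0.5, 0.25]
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = frank_wolfe_l1(A, b, radius=2.0, iters=500)
```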
Cite

Text:
Guzmán. "Open Problem: The Oracle Complexity of Smooth Convex Optimization in Nonstandard Settings." Annual Conference on Computational Learning Theory, 2015.

Markdown:
[Guzmán. "Open Problem: The Oracle Complexity of Smooth Convex Optimization in Nonstandard Settings." Annual Conference on Computational Learning Theory, 2015.](https://mlanthology.org/colt/2015/guzman2015colt-open/)

BibTeX:
@inproceedings{guzman2015colt-open,
  title = {{Open Problem: The Oracle Complexity of Smooth Convex Optimization in Nonstandard Settings}},
  author = {Guzmán, Cristóbal},
  booktitle = {Annual Conference on Computational Learning Theory},
  year = {2015},
  pages = {1761--1763},
  url = {https://mlanthology.org/colt/2015/guzman2015colt-open/}
}