Highly Smooth Minimization of Non-Smooth Problems
Abstract
We establish improved rates for structured \emph{non-smooth} optimization problems by means of near-optimal higher-order accelerated methods. In particular, given access to a standard oracle model that provides a $p^{\text{th}}$-order Taylor expansion of a \emph{smoothed} version of the function, we show how to achieve $\epsilon$-optimality for the \emph{original} problem in $\tilde{O}_p\left(\epsilon^{-\frac{2p+2}{3p+1}}\right)$ calls to the oracle. Furthermore, when $p=3$, we provide an efficient implementation of the near-optimal accelerated scheme that achieves an $O(\epsilon^{-4/5})$ iteration complexity, where each iteration requires $\tilde{O}(1)$ calls to a linear system solver. Thus, we go beyond the previous $O(\epsilon^{-1})$ barrier in terms of $\epsilon$ dependence, and in the case of $\ell_\infty$ regression and $\ell_1$-SVM, we establish overall improvements for some parameter settings in the moderate-accuracy regime. Our results also lead to improved high-accuracy rates for minimizing a large class of convex quartic polynomials.
Cite
Text
Bullins. "Highly Smooth Minimization of Non-Smooth Problems." Conference on Learning Theory, 2020.
Markdown
[Bullins. "Highly Smooth Minimization of Non-Smooth Problems." Conference on Learning Theory, 2020.](https://mlanthology.org/colt/2020/bullins2020colt-highly/)
BibTeX
@inproceedings{bullins2020colt-highly,
title = {{Highly Smooth Minimization of Non-Smooth Problems}},
author = {Bullins, Brian},
booktitle = {Conference on Learning Theory},
year = {2020},
pages = {988--1030},
volume = {125},
url = {https://mlanthology.org/colt/2020/bullins2020colt-highly/}
}