High Probability Complexity Bounds for Line Search Based on Stochastic Oracles
Abstract
We consider a line-search method for continuous optimization in a stochastic setting where function values and gradients are available only through inexact probabilistic zeroth- and first-order oracles. These oracles capture multiple standard settings, including expected loss minimization and zeroth-order optimization. Moreover, our framework is very general and allows the function and gradient estimates to be biased. The proposed algorithm is simple to describe and easy to implement, and it uses these oracles in much the same way that standard deterministic line search uses exact function and gradient values. Under fairly general conditions on the oracles, we derive a high-probability tail bound on the iteration complexity of the algorithm when applied to non-convex smooth functions. These results are stronger than those for other existing stochastic line-search methods and apply in more general settings.
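The scheme the abstract describes can be illustrated with a backtracking (Armijo-style) line search that queries noisy oracles instead of exact values. The following is a minimal sketch of that general idea, not the authors' exact algorithm; all parameter names (`alpha0`, `theta`, `gamma`) and the oracle interfaces are assumptions made for illustration.

```python
import numpy as np

def stochastic_line_search(x0, grad_oracle, value_oracle, alpha0=1.0,
                           theta=0.5, gamma=0.5, max_iters=200, eps=1e-6):
    """Armijo-style backtracking line search driven by inexact oracles.

    grad_oracle(x)  -> noisy gradient estimate   (first-order oracle)
    value_oracle(x) -> noisy function estimate   (zeroth-order oracle)

    Illustrative sketch only, not the paper's exact method; parameter
    names and interfaces are assumptions.
    """
    x, alpha = np.asarray(x0, dtype=float), alpha0
    for _ in range(max_iters):
        g = grad_oracle(x)                      # inexact gradient estimate
        if np.linalg.norm(g) < eps:
            break
        x_trial = x - alpha * g
        # Sufficient-decrease (Armijo) test, using noisy function values
        if value_oracle(x_trial) <= value_oracle(x) - theta * alpha * (g @ g):
            x = x_trial
            alpha = min(alpha / gamma, alpha0)  # successful step: expand
        else:
            alpha *= gamma                      # unsuccessful step: shrink
    return x

# Usage: minimize f(x) = ||x||^2 through noisy oracles
rng = np.random.default_rng(0)
f = lambda x: x @ x + 1e-3 * rng.normal()
df = lambda x: 2 * x + 1e-3 * rng.normal(size=x.shape)
x_star = stochastic_line_search(np.array([3.0, -2.0]), df, f)
```

Because the function and gradient estimates are noisy, the Armijo test can accept or reject a step incorrectly; the paper's contribution is a high-probability iteration-complexity bound for this kind of scheme despite such errors.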
Cite
Text

Jin et al. "High Probability Complexity Bounds for Line Search Based on Stochastic Oracles." Neural Information Processing Systems, 2021.

Markdown

[Jin et al. "High Probability Complexity Bounds for Line Search Based on Stochastic Oracles." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/jin2021neurips-high/)

BibTeX
@inproceedings{jin2021neurips-high,
  title     = {{High Probability Complexity Bounds for Line Search Based on Stochastic Oracles}},
  author    = {Jin, Billy and Scheinberg, Katya and Xie, Miaolan},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/jin2021neurips-high/}
}