On the Complexity of Finding Stationary Points of Smooth Functions in One Dimension

Abstract

We characterize the query complexity of finding stationary points of one-dimensional non-convex but smooth functions. We consider four settings, based on whether the algorithms under consideration are deterministic or randomized, and whether the oracle outputs $1^{\rm st}$-order or both $0^{\rm th}$- and $1^{\rm st}$-order information. Our results show that algorithms for this task provably benefit by incorporating either randomness or $0^{\rm th}$-order information. Our results also show that, for every dimension $d \geq 1$, gradient descent is optimal among deterministic algorithms using $1^{\rm st}$-order queries only.
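Below is a minimal sketch (not the paper's construction) of the baseline the abstract refers to: gradient descent with step size $1/L$ on an $L$-smooth one-dimensional function, using only $1^{\rm st}$-order queries and stopping at the first $\varepsilon$-stationary point. The test function, step size, and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gradient_descent_1d(grad, x0, L, eps, max_iters=100_000):
    """Run x_{t+1} = x_t - (1/L) * grad(x_t) until |grad(x)| <= eps.

    Returns the approximate stationary point and the number of
    gradient (1st-order) queries used.
    """
    x = x0
    for t in range(max_iters):
        g = grad(x)
        if abs(g) <= eps:      # epsilon-stationary point found
            return x, t + 1
        x = x - g / L          # standard 1/L step for L-smooth objectives
    return x, max_iters

# Hypothetical smooth non-convex example: f(x) = x^2 + 3*sin(x)^2,
# whose second derivative is bounded by L = 8.
f = lambda x: x**2 + 3 * np.sin(x)**2
grad_f = lambda x: 2 * x + 3 * np.sin(2 * x)   # f'(x)

x_hat, queries = gradient_descent_1d(grad_f, x0=2.0, L=8.0, eps=1e-4)
print(f"approx. stationary point x = {x_hat:.6f} after {queries} gradient queries")
```

The standard analysis of this scheme uses $O\big(L\,(f(x_0) - \inf f)/\varepsilon^2\big)$ gradient queries; the paper's contribution is to pin down when deterministic $1^{\rm st}$-order algorithms can or cannot improve on this, and how randomness or $0^{\rm th}$-order queries change the picture.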

Cite

Text

Chewi et al. "On the Complexity of Finding Stationary Points of Smooth Functions in One Dimension." Proceedings of The 34th International Conference on Algorithmic Learning Theory, 2023.

Markdown

[Chewi et al. "On the Complexity of Finding Stationary Points of Smooth Functions in One Dimension." Proceedings of The 34th International Conference on Algorithmic Learning Theory, 2023.](https://mlanthology.org/alt/2023/chewi2023alt-complexity/)

BibTeX

@inproceedings{chewi2023alt-complexity,
  title     = {{On the Complexity of Finding Stationary Points of Smooth Functions in One Dimension}},
  author    = {Chewi, Sinho and Bubeck, Sébastien and Salim, Adil},
  booktitle = {Proceedings of The 34th International Conference on Algorithmic Learning Theory},
  year      = {2023},
  pages     = {358--374},
  volume    = {201},
  url       = {https://mlanthology.org/alt/2023/chewi2023alt-complexity/}
}