Path Length Bounds for Gradient Descent and Flow
Abstract
We derive bounds on the path length $\zeta$ of gradient descent (GD) and gradient flow (GF) curves for various classes of smooth convex and nonconvex functions. Among other results, we prove that: (a) if the iterates are linearly convergent with factor $(1-c)$, then $\zeta$ is at most $\mathcal{O}(1/c)$; (b) under the Polyak-Kurdyka-Łojasiewicz (PKL) condition, $\zeta$ is at most $\mathcal{O}(\sqrt{\kappa})$, where $\kappa$ is the condition number, and at least $\widetilde\Omega(\sqrt{d} \wedge \kappa^{1/4})$; (c) for quadratics, $\zeta$ is $\Theta(\min\{\sqrt{d},\sqrt{\log \kappa}\})$ and in some cases can be independent of $\kappa$; (d) assuming just convexity, $\zeta$ can be at most $2^{4d\log d}$; (e) for separable quasiconvex functions, $\zeta$ is $\Theta(\sqrt{d})$. Thus, we advance current understanding of the properties of GD and GF curves beyond rates of convergence. We expect our techniques to facilitate future studies for other algorithms.
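As a reading aid (the abstract does not restate the definition): for GD, the path length is the total length of the piecewise-linear curve through the iterates, $\zeta = \sum_{t \ge 0} \|x_{t+1} - x_t\|$, and for GF it is the arc length $\zeta = \int_0^\infty \|\dot{x}(t)\|\,dt$. Below is a minimal numerical sketch of the GD quantity on an ill-conditioned quadratic; the function gd_path_length, the test matrix, and the step size are illustrative choices, not from the paper.

import numpy as np

def gd_path_length(A, x0, step, n_iters=1000):
    # Total length of the piecewise-linear GD curve on f(x) = 0.5 * x^T A x,
    # i.e. zeta = sum_t ||x_{t+1} - x_t||.
    x = x0.copy()
    zeta = 0.0
    for _ in range(n_iters):
        x_next = x - step * (A @ x)  # GD step; the gradient of f is A x
        zeta += np.linalg.norm(x_next - x)
        x = x_next
    return zeta

A = np.diag([1.0, 100.0])                        # condition number kappa = 100
x0 = np.ones(2)
zeta = gd_path_length(A, x0, step=1.0 / 100.0)   # step size 1/L with L = 100
print(f"path length zeta ~= {zeta:.4f}")

With this step size the iterates contract linearly with factor $(1 - 1/\kappa)$, so result (a) would bound $\zeta$ by $\mathcal{O}(\kappa)$, while the quadratic-specific result (c) improves this to $\Theta(\min\{\sqrt{d},\sqrt{\log \kappa}\})$.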
Cite
Text
Gupta et al. "Path Length Bounds for Gradient Descent and Flow." Journal of Machine Learning Research, 2021.Markdown
[Gupta et al. "Path Length Bounds for Gradient Descent and Flow." Journal of Machine Learning Research, 2021.](https://mlanthology.org/jmlr/2021/gupta2021jmlr-path/)BibTeX
@article{gupta2021jmlr-path,
  title   = {{Path Length Bounds for Gradient Descent and Flow}},
  author  = {Gupta, Chirag and Balakrishnan, Sivaraman and Ramdas, Aaditya},
  journal = {Journal of Machine Learning Research},
  year    = {2021},
  volume  = {22},
  pages   = {1--63},
  url     = {https://mlanthology.org/jmlr/2021/gupta2021jmlr-path/}
}