On Riemannian Stochastic Approximation Schemes with Fixed Step-Size

Abstract

This paper studies fixed step-size stochastic approximation (SA) schemes, including stochastic gradient schemes, in a Riemannian framework. It is motivated by several applications where geodesics can be computed explicitly, and where their use accelerates crude Euclidean methods. A fixed step-size scheme defines a family of time-homogeneous Markov chains, parametrized by the step-size. Here, using this formulation, non-asymptotic performance bounds are derived under Lyapunov conditions. Then, for any step-size, the corresponding Markov chain is proved to admit a unique stationary distribution and to be geometrically ergodic. This result gives rise to a family of stationary distributions indexed by the step-size, which is further shown to converge to a Dirac measure, concentrated at the solution of the problem at hand, as the step-size goes to $0$. Finally, the asymptotic rate of this convergence is established through an asymptotic expansion of the bias and a central limit theorem.
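To make the setting concrete, here is a minimal, hypothetical sketch of a fixed step-size Riemannian stochastic gradient scheme on the unit sphere, where the exponential map is available in closed form. The objective (a chordal mean of sample points), the function names, and the step-size value are illustrative assumptions, not taken from the paper; the point is only that each iterate depends on the previous one through a fixed step-size update, so the iterates form a time-homogeneous Markov chain parametrized by the step-size.

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: follow the geodesic
    from x in the direction of the tangent vector v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def riemannian_sgd_sphere(samples, gamma, n_steps, rng):
    """Fixed step-size Riemannian SA scheme (illustrative): minimize the
    average chordal distance x -> E[||x - Y||^2 / 2] over the unit sphere.

    With gamma held fixed, the iterates (x_k) form a time-homogeneous
    Markov chain indexed by the step-size gamma."""
    x = samples[0] / np.linalg.norm(samples[0])
    for _ in range(n_steps):
        y = samples[rng.integers(len(samples))]
        # Riemannian gradient: project the Euclidean gradient (x - y)
        # onto the tangent space at x.
        g = (x - y) - np.dot(x - y, x) * x
        # Geodesic step of length gamma * ||g|| in the descent direction.
        x = sphere_exp(x, -gamma * g)
    return x
```

In this sketch, a vanishing step-size would make the iterates converge to the minimizer, whereas the fixed step-size chain instead settles into a stationary distribution whose mass concentrates around the minimizer as `gamma` goes to 0, which is the regime the paper analyzes.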

Cite

Text

Durmus et al. "On Riemannian Stochastic Approximation Schemes with Fixed Step-Size." Artificial Intelligence and Statistics, 2021.

Markdown

[Durmus et al. "On Riemannian Stochastic Approximation Schemes with Fixed Step-Size." Artificial Intelligence and Statistics, 2021.](https://mlanthology.org/aistats/2021/durmus2021aistats-riemannian/)

BibTeX

@inproceedings{durmus2021aistats-riemannian,
  title     = {{On Riemannian Stochastic Approximation Schemes with Fixed Step-Size}},
  author    = {Durmus, Alain and Jiménez, Pablo and Moulines, Eric and Said, Salem},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2021},
  pages     = {1018--1026},
  volume    = {130},
  url       = {https://mlanthology.org/aistats/2021/durmus2021aistats-riemannian/}
}