AI-SARAH: Adaptive and Implicit Stochastic Recursive Gradient Methods
Abstract
We present AI-SARAH, a practical variant of SARAH. Like SARAH, the algorithm employs the stochastic recursive gradient, but it adjusts its step-size based on local geometry: AI-SARAH computes the step-size implicitly and efficiently estimates the local Lipschitz smoothness of the stochastic functions. It is fully adaptive, tune-free, straightforward to implement, and computationally efficient. We provide technical insight and intuitive illustrations of its design and convergence, and we conduct an extensive empirical analysis demonstrating its strong performance compared with its classical counterparts and other state-of-the-art first-order methods on convex machine learning problems.
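For context, below is a minimal NumPy sketch of one outer iteration of classical SARAH, the recursion the abstract refers to. The names sarah_inner_loop and grad_i are illustrative assumptions, and the fixed step_size stands in for the quantity AI-SARAH computes adaptively; this is a sketch of the underlying SARAH update, not the authors' implementation.

import numpy as np

def sarah_inner_loop(grad_i, w0, n, step_size, m, rng=None):
    # One outer iteration of classical SARAH: a full gradient at the
    # snapshot w0, followed by m - 1 cheap recursive corrections.
    # grad_i(w, i) returns the gradient of the i-th component function at w.
    rng = rng or np.random.default_rng()
    v = np.mean([grad_i(w0, i) for i in range(n)], axis=0)  # full gradient
    w_prev, w = w0, w0 - step_size * v
    for _ in range(m - 1):
        i = rng.integers(n)
        # Stochastic recursive gradient: update the running estimate with
        # the gradient difference at a single sampled component.
        v = grad_i(w, i) - grad_i(w_prev, i) + v
        w_prev, w = w, w - step_size * v
    return w

AI-SARAH would replace the fixed step_size with one obtained implicitly from its local Lipschitz-smoothness estimate at each step, which is the part this sketch deliberately leaves out.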
Cite
Text
Shi et al. "AI-SARAH: Adaptive and Implicit Stochastic Recursive Gradient Methods." Transactions on Machine Learning Research, 2023.
Markdown
[Shi et al. "AI-SARAH: Adaptive and Implicit Stochastic Recursive Gradient Methods." Transactions on Machine Learning Research, 2023.](https://mlanthology.org/tmlr/2023/shi2023tmlr-aisarah/)
BibTeX
@article{shi2023tmlr-aisarah,
  title = {{AI-SARAH: Adaptive and Implicit Stochastic Recursive Gradient Methods}},
  author = {Shi, Zheng and Sadiev, Abdurakhmon and Loizou, Nicolas and Richtárik, Peter and Takáč, Martin},
  journal = {Transactions on Machine Learning Research},
  year = {2023},
  url = {https://mlanthology.org/tmlr/2023/shi2023tmlr-aisarah/}
}