Stochastic Gauss-Newton Algorithms for Nonconvex Compositional Optimization
Abstract
We develop two new stochastic Gauss-Newton algorithms for solving a class of nonconvex stochastic compositional optimization problems frequently arising in practice. We consider both the expectation and finite-sum settings under standard assumptions, and use both classical stochastic and SARAH estimators for approximating function values and Jacobians. In the expectation case, we establish $\mathcal{O}(\varepsilon^{-2})$ iteration-complexity to achieve a stationary point in expectation and estimate the total number of stochastic oracle calls for both the function value and its Jacobian, where $\varepsilon$ is a desired accuracy. In the finite-sum case, we also establish $\mathcal{O}(\varepsilon^{-2})$ iteration-complexity and bound the total number of oracle calls with high probability. To the best of our knowledge, this is the first time such global stochastic oracle complexity has been established for stochastic Gauss-Newton methods. Finally, we illustrate our theoretical results via two numerical examples on both synthetic and real datasets.
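To give a concrete sense of the Gauss-Newton step underlying the paper's algorithms, the sketch below works out the classical deterministic special case of the compositional problem $\min_x \varphi(F(x))$ with $\varphi(u) = \tfrac{1}{2}\|u\|^2$ and an exact Jacobian, which reduces to damped Gauss-Newton for nonlinear least squares. This is an illustrative assumption, not the paper's stochastic method: the toy residual map `F`, its root, and the damping parameter are all chosen here for demonstration.

```python
import numpy as np

def F(x):
    # Toy residual map R^2 -> R^2, chosen here for illustration;
    # it has a root at x = (1, 2).
    return np.array([x[0]**2 + x[1] - 3.0,
                     x[0] + x[1]**2 - 5.0])

def J(x):
    # Exact Jacobian of F (the paper's algorithms would instead use a
    # stochastic or SARAH estimator of this quantity).
    return np.array([[2.0 * x[0], 1.0],
                     [1.0, 2.0 * x[1]]])

def gauss_newton(x0, iters=20, damping=1e-8):
    """Damped Gauss-Newton iteration for min_x 0.5 * ||F(x)||^2."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r, Jx = F(x), J(x)
        # Solve the damped normal equations: (J^T J + lam I) dx = -J^T r.
        dx = np.linalg.solve(Jx.T @ Jx + damping * np.eye(len(x)),
                             -Jx.T @ r)
        x = x + dx
    return x

x_star = gauss_newton([1.0, 1.0])
print(x_star, np.linalg.norm(F(x_star)))
```

The stochastic variants in the paper replace the exact function value and Jacobian with mini-batch or SARAH estimates, which is what drives the oracle-complexity analysis in the abstract.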
Cite
Text
Tran-Dinh et al. "Stochastic Gauss-Newton Algorithms for Nonconvex Compositional Optimization." International Conference on Machine Learning, 2020.

Markdown

[Tran-Dinh et al. "Stochastic Gauss-Newton Algorithms for Nonconvex Compositional Optimization." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/trandinh2020icml-stochastic/)

BibTeX
@inproceedings{trandinh2020icml-stochastic,
title = {{Stochastic Gauss-Newton Algorithms for Nonconvex Compositional Optimization}},
author = {Tran-Dinh, Quoc and Pham, Nhan and Nguyen, Lam},
booktitle = {International Conference on Machine Learning},
year = {2020},
pages = {9572--9582},
volume = {119},
url = {https://mlanthology.org/icml/2020/trandinh2020icml-stochastic/}
}