A Fully First-Order Method for Stochastic Bilevel Optimization

Abstract

We consider stochastic unconstrained bilevel optimization problems when only first-order gradient oracles are available. While numerous optimization methods have been proposed for tackling bilevel problems, existing methods either require potentially expensive computations involving Hessians of the lower-level objective or lack rigorous finite-time performance guarantees. In this work, we propose a Fully First-order Stochastic Approximation (F2SA) method and study its non-asymptotic convergence properties. Specifically, we show that F2SA converges to an $\epsilon$-stationary solution of the bilevel problem after $\epsilon^{-7/2}$, $\epsilon^{-5/2}$, and $\epsilon^{-3/2}$ iterations (each iteration using $O(1)$ samples) when stochastic noise is present in both level objectives, only in the upper-level objective, and in neither (the deterministic setting), respectively. We further show that if we employ momentum-assisted gradient estimators, the iteration complexities improve to $\epsilon^{-5/2}$, $\epsilon^{-4/2}$, and $\epsilon^{-3/2}$, respectively. We demonstrate that the proposed method even outperforms existing second-order-based approaches in MNIST data-hypercleaning experiments.
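
To make the flavor of a fully first-order bilevel update concrete, the sketch below runs a value-function-penalty scheme on a hypothetical one-dimensional toy problem. It is an illustrative simplification, not the exact F2SA algorithm: the quadratic objectives, the single inner step per iteration, and the fixed step-size and multiplier-growth constants are assumptions chosen for readability, whereas the paper analyzes specific stochastic estimators and step-size/multiplier schedules.

```python
# Toy, fully first-order bilevel sketch (illustrative; not the exact F2SA algorithm).
#   upper level: f(x, y) = 0.5*(x - 1)**2 + 0.5*(y - 2)**2
#   lower level: g(x, y) = 0.5*(y - x)**2, so y*(x) = x and
#   F(x) = f(x, y*(x)) is minimized at x = 1.5.

def grad_f(x, y):
    return (x - 1.0, y - 2.0)          # (df/dx, df/dy)

def grad_g(x, y):
    return (x - y, y - x)              # (dg/dx, dg/dy)

def fully_first_order_bilevel(T=2000, alpha=1e-2, gamma=5e-2,
                              lam=1.0, lam_growth=1e-3):
    """Value-function-penalty updates using only first-order gradients.

    All constants (T, alpha, gamma, lam, lam_growth) are illustrative choices,
    not the schedules analyzed in the paper.
    """
    x, y, z = 0.0, 0.0, 0.0
    for _ in range(T):
        # y tracks argmin_y g(x, y); z tracks argmin_z f(x, z) + lam * g(x, z).
        y -= gamma * grad_g(x, y)[1]
        z -= gamma * (grad_f(x, z)[1] + lam * grad_g(x, z)[1])
        # First-order surrogate of the hypergradient:
        #   d/dx [ f(x, z) + lam * (g(x, z) - g(x, y)) ]
        hx = grad_f(x, z)[0] + lam * (grad_g(x, z)[0] - grad_g(x, y)[0])
        x -= alpha * hx
        lam += lam_growth              # slowly increasing penalty multiplier
    return x, y, z

if __name__ == "__main__":
    x, y, z = fully_first_order_bilevel()
    # x approaches 1.5 as the multiplier grows; with a finite multiplier the
    # iterate retains a small bias toward the penalty-regularized optimum.
    print(f"x = {x:.3f}, y = {y:.3f}, z = {z:.3f}")
```

Only gradients of f and g appear above; no Hessians or Hessian-vector products are needed. Replacing the exact gradients with $O(1)$-sample stochastic estimates (and, in the improved variant, momentum-assisted estimators) is what the quoted iteration complexities refer to.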

Cite

Text

Kwon et al. "A Fully First-Order Method for Stochastic Bilevel Optimization." International Conference on Machine Learning, 2023.

Markdown

[Kwon et al. "A Fully First-Order Method for Stochastic Bilevel Optimization." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/kwon2023icml-fully/)

BibTeX

@inproceedings{kwon2023icml-fully,
  title     = {{A Fully First-Order Method for Stochastic Bilevel Optimization}},
  author    = {Kwon, Jeongyeol and Kwon, Dohyun and Wright, Stephen and Nowak, Robert D.},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {18083--18113},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/kwon2023icml-fully/}
}