Efficiently Escaping Saddle Points in Bilevel Optimization

Abstract

Bilevel optimization is one of the fundamental problems in machine learning and optimization. Recent theoretical developments in bilevel optimization focus on finding first-order stationary points in the nonconvex-strongly-convex setting. In this paper, we analyze algorithms that can escape saddle points in nonconvex-strongly-convex bilevel optimization. Specifically, we show that the perturbed approximate implicit differentiation (AID) method with a warm-start strategy finds an $\epsilon$-approximate local minimum of the bilevel problem in $\tilde{O}(\epsilon^{-2})$ iterations with high probability. Moreover, we propose the inexact NEgative-curvature-Originated-from-Noise algorithm (iNEON), which escapes saddle points and finds a local minimum of stochastic bilevel optimization. As a by-product, we provide the first nonasymptotic analysis of the perturbed multi-step gradient descent ascent (GDmax) algorithm, showing that it converges to a local minimax point of minimax problems.
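
For reference, the nonconvex-strongly-convex bilevel problem discussed in the abstract is typically written in the following standard form; the symbols $f$, $g$, and $y^*(x)$ are illustrative notation, not taken from the paper itself:

$$\min_{x \in \mathbb{R}^d} \; \Phi(x) := f\bigl(x, y^*(x)\bigr) \quad \text{subject to} \quad y^*(x) = \arg\min_{y \in \mathbb{R}^p} g(x, y),$$

where the upper-level objective $f$ may be nonconvex in $x$ and the lower-level objective $g(x, \cdot)$ is strongly convex in $y$, so that $y^*(x)$ is unique and $\Phi$ is well defined.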

Cite

Text

Huang et al. "Efficiently Escaping Saddle Points in Bilevel Optimization." Journal of Machine Learning Research, 2025.

Markdown

[Huang et al. "Efficiently Escaping Saddle Points in Bilevel Optimization." Journal of Machine Learning Research, 2025.](https://mlanthology.org/jmlr/2025/huang2025jmlr-efficiently/)

BibTeX

@article{huang2025jmlr-efficiently,
  title     = {{Efficiently Escaping Saddle Points in Bilevel Optimization}},
  author    = {Huang, Minhui and Chen, Xuxing and Ji, Kaiyi and Ma, Shiqian and Lai, Lifeng},
  journal   = {Journal of Machine Learning Research},
  year      = {2025},
  pages     = {1--61},
  volume    = {26},
  url       = {https://mlanthology.org/jmlr/2025/huang2025jmlr-efficiently/}
}