An Alternating Optimization Method for Bilevel Problems Under the Polyak-Łojasiewicz Condition
Abstract
Bilevel optimization has recently regained interest owing to its applications in emerging machine learning fields such as hyperparameter optimization, meta-learning, and reinforcement learning. Recent results have shown that simple alternating (implicit) gradient-based algorithms can match the convergence rate of single-level gradient descent (GD) when addressing bilevel problems with a strongly convex lower-level objective. However, it remains unclear whether this result can be generalized to bilevel problems beyond this basic setting. In this paper, we first introduce a stationarity metric for the considered bilevel problems, which generalizes the existing metric, for a nonconvex lower-level objective that satisfies the Polyak-Łojasiewicz (PL) condition. We then propose a Generalized ALternating mEthod for bilevel opTimization (GALET), tailored to bilevel optimization (BLO) problems with a convex PL lower-level objective, and establish that GALET achieves an $\epsilon$-stationary point for the considered problem within $\tilde{\cal O}(\epsilon^{-1})$ iterations, which matches the iteration complexity of GD for single-level smooth nonconvex problems.
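To illustrate the alternating (implicit) gradient idea the abstract refers to, here is a minimal toy sketch. This is not the GALET algorithm from the paper; it is a generic alternating scheme on an assumed illustrative problem in which the lower-level objective is strongly convex in $y$ (and therefore satisfies the PL condition), so the implicit gradient is available in closed form.

```python
# Toy alternating (implicit) gradient method for a bilevel problem.
# NOT the paper's GALET algorithm -- a generic sketch under assumed objectives:
#   upper level: f(x, y) = 0.5*(x - 2)**2 + 0.5*y**2
#   lower level: g(x, y) = 0.5*(y - x)**2   (strongly convex in y, hence PL)
# The lower-level minimizer is y*(x) = x, so the value function
# F(x) = f(x, y*(x)) = 0.5*(x - 2)**2 + 0.5*x**2 is minimized at x = 1.

def alternating_bilevel(x=0.0, y=0.0, alpha=0.1, beta=0.5,
                        inner_steps=5, outer_steps=200):
    for _ in range(outer_steps):
        # Inner loop: gradient descent on the lower-level objective in y.
        for _ in range(inner_steps):
            y -= beta * (y - x)          # dg/dy = y - x
        # Implicit differentiation: dy*/dx = -g_xy / g_yy = 1 for this g.
        hypergrad = (x - 2) + y * 1.0    # df/dx + df/dy * dy*/dx
        x -= alpha * hypergrad           # outer (hyper)gradient step
    return x, y

x_star, y_star = alternating_bilevel()
print(x_star, y_star)  # both approach 1.0
```

The inner loop drives $y$ toward the lower-level minimizer $y^*(x)$, and the outer step descends the value function using the implicit-function-theorem correction; under the PL condition this correction must be handled more carefully since the lower-level minimizer need not be unique, which is part of what the paper addresses.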
Cite
Text
Xiao et al. "An Alternating Optimization Method for Bilevel Problems Under the Polyak-Łojasiewicz Condition." Neural Information Processing Systems, 2023.
Markdown
[Xiao et al. "An Alternating Optimization Method for Bilevel Problems Under the Polyak-Łojasiewicz Condition." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/xiao2023neurips-alternating/)
BibTeX
@inproceedings{xiao2023neurips-alternating,
title = {{An Alternating Optimization Method for Bilevel Problems Under the Polyak-Łojasiewicz Condition}},
author = {Xiao, Quan and Lu, Songtao and Chen, Tianyi},
booktitle = {Neural Information Processing Systems},
year = {2023},
url = {https://mlanthology.org/neurips/2023/xiao2023neurips-alternating/}
}