On Finding Small Hyper-Gradients in Bilevel Optimization: Hardness Results and Improved Analysis
Abstract
Bilevel optimization reveals the inner structure of otherwise oblique optimization problems, such as hyperparameter tuning, neural architecture search, and meta-learning. A common goal in bilevel optimization is to minimize a hyper-objective that implicitly depends on the solution set of the lower-level function. Although this hyper-objective approach is widely used, its theoretical properties have not been thoroughly investigated in cases where *the lower-level functions lack strong convexity*. In this work, we first provide hardness results to show that the goal of finding stationary points of the hyper-objective for nonconvex-convex bilevel optimization can be intractable for zero-respecting algorithms. Then we study a class of tractable nonconvex-nonconvex bilevel problems when the lower-level function satisfies the Polyak-Łojasiewicz (PL) condition. We show a simple first-order algorithm can achieve complexity bounds of $\tilde{\mathcal{O}}(\epsilon^{-2})$, $\tilde{\mathcal{O}}(\epsilon^{-4})$ and $\tilde{\mathcal{O}}(\epsilon^{-6})$ in the deterministic, partially stochastic, and fully stochastic settings, respectively.
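For context, the hyper-objective setup the abstract refers to can be sketched in standard (optimistic) bilevel notation; the symbols $f$, $g$, $Y^*$, and $\varphi$ below are generic notation for illustration, not necessarily the paper's:

$$
\min_{x}\; \varphi(x), \qquad \varphi(x) = \min_{y \in Y^*(x)} f(x, y), \qquad Y^*(x) = \operatorname*{arg\,min}_{y}\; g(x, y),
$$

where $g$ is the lower-level function, $Y^*(x)$ is its (possibly non-singleton) solution set, and finding a small hyper-gradient means finding an $\epsilon$-stationary point of $\varphi$, i.e., a point $x$ with $\|\nabla \varphi(x)\| \le \epsilon$.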
Cite
Text
Chen et al. "On Finding Small Hyper-Gradients in Bilevel Optimization: Hardness Results and Improved Analysis." Conference on Learning Theory, 2024.

Markdown
[Chen et al. "On Finding Small Hyper-Gradients in Bilevel Optimization: Hardness Results and Improved Analysis." Conference on Learning Theory, 2024.](https://mlanthology.org/colt/2024/chen2024colt-finding/)

BibTeX
@inproceedings{chen2024colt-finding,
title = {{On Finding Small Hyper-Gradients in Bilevel Optimization: Hardness Results and Improved Analysis}},
author = {Chen, Lesi and Xu, Jing and Zhang, Jingzhao},
booktitle = {Conference on Learning Theory},
year = {2024},
pages = {947-980},
volume = {247},
url = {https://mlanthology.org/colt/2024/chen2024colt-finding/}
}