A Generic First-Order Algorithmic Framework for Bi-Level Programming Beyond Lower-Level Singleton
Abstract
In recent years, a variety of gradient-based bi-level optimization methods have been developed for learning tasks. However, the theoretical guarantees of these existing approaches often rely heavily on the simplification that, for each fixed upper-level variable, the lower-level solution must be a singleton (a.k.a. the Lower-Level Singleton, or LLS, condition). In this work, by formulating bi-level models from the optimistic viewpoint and aggregating hierarchical objective information, we establish Bi-level Descent Aggregation (BDA), a flexible and modularized algorithmic framework for bi-level programming. Theoretically, we derive a new methodology to prove the convergence of BDA without the LLS condition. Furthermore, based on our proof recipe, we improve the convergence properties of conventional first-order bi-level schemes (under the LLS simplification). Extensive experiments justify our theoretical results and demonstrate the superiority of the proposed BDA on different tasks, including hyper-parameter optimization and meta-learning.
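To make the aggregation idea concrete, below is a minimal numerical sketch (not the authors' code) of a BDA-style inner loop on a toy problem whose lower-level solution set is not a singleton: each inner step follows a convex combination of the upper-level and lower-level descent directions rather than the lower-level gradient alone. The toy objectives, the step size alpha, and the decaying mixing weight mu_k are all illustrative assumptions, not choices taken from the paper.

import numpy as np

# Toy instance (illustrative assumption, not from the paper):
#   upper level  F(x, y) = 0.5 * ||y - x||^2
#   lower level  f(x, y) = 0.5 * y[0]**2
# argmin_y f(x, .) is the whole line {y : y[0] = 0}, so the LLS
# condition fails; descent on f alone leaves y[1] arbitrary.

def grad_F_y(x, y):
    return y - x

def grad_f_y(x, y):
    g = np.zeros_like(y)
    g[0] = y[0]
    return g

def bda_inner(x, y0, K=1000, alpha=0.5, mu0=0.5):
    """BDA-style inner loop: each step descends along an aggregation
    of the upper- and lower-level gradients, with a vanishing upper
    weight (the 1/sqrt(k+1) schedule is an assumed illustration)."""
    y = y0.copy()
    for k in range(K):
        mu_k = mu0 / np.sqrt(k + 1)
        d = mu_k * grad_F_y(x, y) + (1.0 - mu_k) * grad_f_y(x, y)
        y = y - alpha * d
    return y

x = np.array([1.0, 2.0])
print(bda_inner(x, y0=np.zeros(2)))
# tends toward (0, 2): the minimizer of F over argmin_y f(x, .),
# i.e. the optimistic bi-level solution that descent on f alone misses

In this toy run, following only the lower-level gradient would return whatever value of y[1] the iterate started with, whereas the aggregated direction steers the iterate toward the element of the lower-level solution set that is best for the upper-level objective.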
Cite
Text
Liu et al. "A Generic First-Order Algorithmic Framework for Bi-Level Programming Beyond Lower-Level Singleton." International Conference on Machine Learning, 2020.

Markdown

[Liu et al. "A Generic First-Order Algorithmic Framework for Bi-Level Programming Beyond Lower-Level Singleton." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/liu2020icml-generic/)

BibTeX
@inproceedings{liu2020icml-generic,
  title     = {{A Generic First-Order Algorithmic Framework for Bi-Level Programming Beyond Lower-Level Singleton}},
  author    = {Liu, Risheng and Mu, Pan and Yuan, Xiaoming and Zeng, Shangzhi and Zhang, Jin},
  booktitle = {International Conference on Machine Learning},
  year      = {2020},
  pages     = {6305--6315},
  volume    = {119},
  url       = {https://mlanthology.org/icml/2020/liu2020icml-generic/}
}