Lower-Level Duality Based Reformulation and Majorization Minimization Algorithm for Hyperparameter Optimization
Abstract
Hyperparameter tuning is an important task in machine learning that can be formulated as a bilevel program (BLP). However, most existing algorithms are not applicable to BLPs with non-smooth lower-level problems. To address this, we propose a single-level reformulation of the BLP based on lower-level duality that does not involve any implicit value function. To solve the reformulation, we propose a majorization-minimization algorithm that majorizes the constraint in each iteration. Furthermore, we show that the subproblems of the proposed algorithm for several widely used hyperparameter tuning models can be reformulated into conic programs that can be solved efficiently by off-the-shelf solvers. We theoretically prove the convergence of the proposed algorithm and demonstrate its superiority through numerical experiments.
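For intuition, the following is a minimal sketch of a duality-based single-level reformulation in the spirit described by the abstract; the symbols ($F$, $g$, $d$, $\lambda$, $w$, $u$) and the exact constraint form are illustrative assumptions, not the paper's own formulation.

```latex
% A generic bilevel program for hyperparameter tuning:
%   lambda: hyperparameters (upper level), w: model parameters (lower level)
\begin{align*}
\min_{\lambda,\, w} \quad & F(\lambda, w)                      && \text{(validation loss)} \\
\text{s.t.} \quad         & w \in \arg\min_{v}\; g(\lambda, v) && \text{(training problem)}
\end{align*}
% If g(\lambda, \cdot) is convex with Lagrangian dual function d(\lambda, u),
% weak duality gives g(\lambda, w) \ge d(\lambda, u) for any dual feasible u,
% so lower-level optimality can be replaced by a single inequality constraint:
\begin{align*}
\min_{\lambda,\, w,\, u} \quad & F(\lambda, w) \\
\text{s.t.} \quad              & g(\lambda, w) - d(\lambda, u) \le 0,
\end{align*}
% which is a single-level program with no implicit value function.
% A majorization-minimization scheme then replaces the (possibly nonconvex)
% constraint function g(\lambda, w) - d(\lambda, u) with a convex majorizer
% at the current iterate and solves the resulting (often conic-representable)
% subproblem at each iteration.
```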
Cite
Text
Chen et al. "Lower-Level Duality Based Reformulation and Majorization Minimization Algorithm for Hyperparameter Optimization." Artificial Intelligence and Statistics, 2024.
Markdown
[Chen et al. "Lower-Level Duality Based Reformulation and Majorization Minimization Algorithm for Hyperparameter Optimization." Artificial Intelligence and Statistics, 2024.](https://mlanthology.org/aistats/2024/chen2024aistats-lowerlevel/)
BibTeX
@inproceedings{chen2024aistats-lowerlevel,
title = {{Lower-Level Duality Based Reformulation and Majorization Minimization Algorithm for Hyperparameter Optimization}},
author = {Chen, He and Xu, Haochen and Jiang, Rujun and Man-Cho So, Anthony},
booktitle = {Artificial Intelligence and Statistics},
year = {2024},
pages = {784-792},
volume = {238},
url = {https://mlanthology.org/aistats/2024/chen2024aistats-lowerlevel/}
}