dEBORA: Efficient Bilevel Optimization-Based Low-Rank Adaptation
Abstract
Low-rank adaptation methods are a popular approach for parameter-efficient fine-tuning of large-scale neural networks. However, selecting the optimal rank for each layer remains a challenging problem that significantly affects both performance and efficiency. In this paper, we introduce a novel bilevel optimization strategy that simultaneously trains both matrix and tensor low-rank adapters, dynamically selecting the optimal rank for each layer. Our method avoids the use of implicit differentiation in the computation of the hypergradient, and integrates a stochastic away-step variant of the Frank-Wolfe algorithm, eliminating the need for projection and providing identifiability guarantees of the optimal rank structure. This results in a highly efficient and cost-effective training scheme that adaptively allocates the parameter budget across the network layers. In addition to a detailed theoretical analysis of the method, we present several numerical experiments showcasing its effectiveness.
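For readers unfamiliar with the low-rank adapters the abstract refers to, the following is a minimal NumPy sketch of the standard LoRA-style parametrization that such methods build on. It is an illustration only, not the paper's method: the shapes, variable names, and zero initialization of the up-projection are assumptions, and the bilevel rank-selection scheme described above is not reproduced here.

```python
import numpy as np

# Illustrative LoRA-style low-rank adapter (assumed setup, not dEBORA itself).
# A frozen pretrained weight W (d_out x d_in) is adapted as W + B @ A,
# where A (r x d_in) and B (d_out x r) are trainable and r << min(d_out, d_in),
# so only (d_out + d_in) * r parameters are trained instead of d_out * d_in.

rng = np.random.default_rng(0)
d_out, d_in, r = 8, 16, 2

W = rng.standard_normal((d_out, d_in))       # frozen pretrained weight
A = 0.01 * rng.standard_normal((r, d_in))    # trainable down-projection
B = np.zeros((d_out, r))                     # trainable up-projection, zero init

def adapted_forward(x):
    """Forward pass through the adapted layer: (W + B @ A) @ x."""
    return (W + B @ A) @ x

x = rng.standard_normal(d_in)
# With B initialized to zero, the low-rank update B @ A is zero, so the
# adapted layer initially matches the frozen forward pass exactly.
assert np.allclose(adapted_forward(x), W @ x)
```

The per-layer choice of the rank `r` is exactly the hyperparameter that the bilevel strategy in this paper selects adaptively across layers.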
Cite
Text
Zangrando et al. "dEBORA: Efficient Bilevel Optimization-Based Low-Rank Adaptation." International Conference on Learning Representations, 2025.
Markdown
[Zangrando et al. "dEBORA: Efficient Bilevel Optimization-Based Low-Rank Adaptation." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/zangrando2025iclr-debora/)
BibTeX
@inproceedings{zangrando2025iclr-debora,
title = {{dEBORA: Efficient Bilevel Optimization-Based Low-Rank Adaptation}},
author = {Zangrando, Emanuele and Venturini, Sara and Rinaldi, Francesco and Tudisco, Francesco},
booktitle = {International Conference on Learning Representations},
year = {2025},
url = {https://mlanthology.org/iclr/2025/zangrando2025iclr-debora/}
}