A Stronger Mixture of Low-Rank Experts for Fine-Tuning Foundation Models
Abstract
To streamline the fine-tuning of foundation models, Low-Rank Adapters (LoRAs) have been widely adopted across various fields, including instruction tuning and domain adaptation. The core idea of LoRA is to decompose a full-rank matrix into the product of two lower-rank matrices, which reduces storage consumption and accelerates training. Furthermore, to address the limited expressive capacity of LoRA, the Mixture-of-Experts (MoE) architecture has been introduced to combine multiple LoRA adapters. Integrating LoRA experts yields visible improvements across several downstream scenarios. However, the mixture of LoRAs (MoE-LoRA) still exhibits low robustness during both tuning and inference. Inspired by Riemannian preconditioners, which train LoRA as a sub-space projector, we propose a new training strategy for MoE-LoRA that stabilizes and boosts its feature learning via gate-rescaled multi-space projections. We provide both a theoretical solution and an alternative engineering strategy. Experiments with the SGD and AdamW optimizers demonstrate the effectiveness of our methodology. Source code is available at https://github.com/THUDM/MoELoRA_Riemannian.
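To make the setup in the abstract concrete, the sketch below shows a generic mixture of LoRA experts in PyTorch: each expert holds a pair of low-rank factors, and a softmax gate mixes the experts' updates on top of a frozen base weight. The class and parameter names (`MoELoRALinear`, `num_experts`, `rank`, `alpha`) are illustrative assumptions, and the sketch does not implement the paper's gate-rescaled, Riemannian-preconditioned training strategy; see the linked repository for the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELoRALinear(nn.Module):
    """Illustrative MoE-LoRA layer: frozen base weight plus gated low-rank experts."""

    def __init__(self, in_features, out_features, num_experts=4, rank=8, alpha=16.0):
        super().__init__()
        # Frozen pretrained weight W0 (no bias, kept out of the optimizer).
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)
        # Low-rank factors per expert: A_i (rank x d_in), B_i (d_out x rank).
        self.A = nn.Parameter(torch.randn(num_experts, rank, in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(num_experts, out_features, rank))
        # Router producing per-token gate weights over the experts.
        self.gate = nn.Linear(in_features, num_experts, bias=False)
        self.scaling = alpha / rank

    def forward(self, x):
        # x: (..., in_features)
        g = F.softmax(self.gate(x), dim=-1)                    # (..., E)
        # Per-expert low-rank projection: x -> A_i x -> B_i A_i x.
        h = torch.einsum('...d,erd->...er', x, self.A)         # (..., E, rank)
        delta = torch.einsum('...er,eor->...eo', h, self.B)    # (..., E, out)
        # Gate-weighted mixture of expert updates, added to the frozen output.
        mixed = (g.unsqueeze(-1) * delta).sum(dim=-2)          # (..., out)
        return self.base(x) + self.scaling * mixed
```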
Cite
Text
Sun et al. "A Stronger Mixture of Low-Rank Experts for Fine-Tuning Foundation Models." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Sun et al. "A Stronger Mixture of Low-Rank Experts for Fine-Tuning Foundation Models." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/sun2025icml-stronger/)
BibTeX
@inproceedings{sun2025icml-stronger,
title = {{A Stronger Mixture of Low-Rank Experts for Fine-Tuning Foundation Models}},
author = {Sun, Mengyang and Wang, Yihao and Feng, Tao and Zhang, Dan and Zhu, Yifan and Tang, Jie},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {57712--57727},
volume = {267},
url = {https://mlanthology.org/icml/2025/sun2025icml-stronger/}
}