Federated Residual Low-Rank Adaptation of Large Language Models

Abstract

Low-Rank Adaptation (LoRA) presents an effective solution for federated fine-tuning of Large Language Models (LLMs), as it substantially reduces communication overhead. However, a straightforward combination of FedAvg and LoRA results in suboptimal performance, especially under data heterogeneity. We note that this stems from both intrinsic (i.e., constrained parameter space) and extrinsic (i.e., client drift) limitations, which hinder it from effectively learning global knowledge. In this work, we propose a novel Federated Residual Low-Rank Adaptation method, namely FRLoRA, to tackle the above two limitations. It directly sums the global model parameters with a residual low-rank matrix product (i.e., the weight change) during the global update step, and synchronizes this update across all local models. In this way, FRLoRA performs global updates in a higher-rank parameter space, enabling a better representation of complex knowledge structures. Furthermore, FRLoRA reinitializes the local low-rank matrices with the principal singular values and vectors of the pre-trained weights in each round, to calibrate their inconsistent convergence, thereby mitigating client drift. Our extensive experiments demonstrate that FRLoRA consistently outperforms various state-of-the-art FL methods across nine different benchmarks in natural language understanding and generation under different FL scenarios.
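The global update and reinitialization steps described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name `frlora_round`, the plain-numpy matrices, the simple averaging of client factors, and the square-root split of the singular values are all assumptions made for clarity.

```python
import numpy as np

def frlora_round(w_global, w_pretrained, client_updates, rank):
    """One hypothetical FRLoRA communication round (illustrative sketch).

    w_global       : (d_out, d_in) current global weight matrix
    w_pretrained   : (d_out, d_in) frozen pre-trained weight matrix
    client_updates : list of (B, A) low-rank factors from clients,
                     with B of shape (d_out, r) and A of shape (r, d_in)
    rank           : LoRA rank r
    """
    # Global update: fold the averaged residual low-rank product
    # (the weight change) directly into the global weights, so that
    # accumulated updates are not confined to a single rank-r subspace.
    delta = np.mean([B @ A for B, A in client_updates], axis=0)
    w_global = w_global + delta

    # Reinitialize the local factors from the top-r singular values
    # and vectors of the pre-trained weights, giving every client the
    # same starting point and calibrating inconsistent convergence.
    U, S, Vt = np.linalg.svd(w_pretrained, full_matrices=False)
    B0 = U[:, :rank] * np.sqrt(S[:rank])          # (d_out, r)
    A0 = np.sqrt(S[:rank])[:, None] * Vt[:rank]   # (r, d_in)
    return w_global, (B0, A0)
```

The synchronized `w_global` and the fresh factors `(B0, A0)` would then be broadcast to all clients for the next round of local training.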

Cite

Text

Yan et al. "Federated Residual Low-Rank Adaptation of Large Language Models." International Conference on Learning Representations, 2025.

Markdown

[Yan et al. "Federated Residual Low-Rank Adaptation of Large Language Models." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/yan2025iclr-federated/)

BibTeX

@inproceedings{yan2025iclr-federated,
  title     = {{Federated Residual Low-Rank Adaptation of Large Language Models}},
  author    = {Yan, Yunlu and Feng, Chun-Mei and Zuo, Wangmeng and Goh, Rick Siow Mong and Liu, Yong and Zhu, Lei},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/yan2025iclr-federated/}
}