PoLAR: Polar-Decomposed Low-Rank Adapter Representation

Abstract

We show that low-rank adaptation of large-scale models suffers from a low stable rank that is well below the linear algebraic rank of the subspace, degrading fine-tuning performance. To mitigate the underutilization of the allocated subspace, we propose PoLAR, a parameterization inspired by the polar decomposition that factorizes the low-rank update into two direction matrices constrained to Stiefel manifolds and an unconstrained scale matrix. Our theory shows that PoLAR yields an exponentially faster convergence rate on a canonical low-rank adaptation problem. Pairing the parameterization with Riemannian optimization leads to consistent gains on three different benchmarks testing general language understanding, commonsense reasoning, and mathematical problem solving, with base model sizes ranging from 350M to 27B parameters.
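The parameterization described above admits a compact sketch. Below is a minimal, illustrative PyTorch rendering assuming only what the abstract states: the low-rank update factorizes as X S Yᵀ, with X and Y constrained to Stiefel manifolds (orthonormal columns) and S an unconstrained r×r scale matrix. The class name PoLARLayer and the symbols X, Y, S are placeholders rather than the paper's notation, and the paired Riemannian optimizer is assumed to keep X and Y on their manifolds during training.

import torch
import torch.nn as nn

class PoLARLayer(nn.Module):
    # Sketch of a PoLAR-style update Delta W = X @ S @ Y.T, where
    # X in St(m, r) and Y in St(n, r) have orthonormal columns and
    # S is an unconstrained r x r scale matrix. Names are illustrative.
    def __init__(self, m: int, n: int, r: int):
        super().__init__()
        # Direction factors initialized on the Stiefel manifold via QR.
        self.X = nn.Parameter(torch.linalg.qr(torch.randn(m, r)).Q)
        self.Y = nn.Parameter(torch.linalg.qr(torch.randn(n, r)).Q)
        # Unconstrained scale matrix; zero init makes the initial update zero.
        self.S = nn.Parameter(torch.zeros(r, r))

    def delta(self) -> torch.Tensor:
        # Feasibility (X^T X = I, Y^T Y = I) is not enforced here; it is
        # maintained by the Riemannian optimization step during training.
        return self.X @ self.S @ self.Y.T

# Usage: a rank-r update for a 768 x 768 weight, added to the frozen base.
layer = PoLARLayer(m=768, n=768, r=8)
dW = layer.delta()

A plain gradient step on X and Y would leave the Stiefel constraint set, which is why the abstract pairs the parameterization with Riemannian optimization: each update is retracted back onto the manifold after the step.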

Cite

Text

Lion et al. "PoLAR: Polar-Decomposed Low-Rank Adapter Representation." Advances in Neural Information Processing Systems, 2025.

Markdown

[Lion et al. "PoLAR: Polar-Decomposed Low-Rank Adapter Representation." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/lion2025neurips-polar/)

BibTeX

@inproceedings{lion2025neurips-polar,
  title     = {{PoLAR: Polar-Decomposed Low-Rank Adapter Representation}},
  author    = {Lion, Kai and Zhang, Liang and Li, Bingcong and He, Niao},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/lion2025neurips-polar/}
}