SVFT: Parameter-Efficient Fine-Tuning with Singular Vectors

Abstract

Popular parameter-efficient fine-tuning (PEFT) methods, such as LoRA and its variants, freeze pre-trained model weights $\mathbf{W}$ and inject learnable matrices $\mathbf{\Delta W}$. These $\mathbf{\Delta W}$ matrices are structured for efficient parameterization, often using techniques like low-rank approximations or scaling vectors. However, these methods typically show a performance gap compared to full fine-tuning. Although recent PEFT methods have narrowed this gap, they do so at the cost of additional learnable parameters. We propose SVFT, a simple approach that fundamentally differs from existing methods: the structure imposed on $\mathbf{\Delta W}$ depends on the specific weight matrix $\mathbf{W}$. Specifically, SVFT updates $\mathbf{W}$ as a sparse combination of outer products of its singular vectors, training only the coefficients (scales) of these sparse combinations. This approach allows fine-grained control over expressivity through the number of coefficients. Extensive experiments on language and vision benchmarks show that SVFT recovers up to **96%** of full fine-tuning performance while training only **0.006 to 0.25%** of parameters, outperforming existing methods that recover only up to **85%** of performance while using **0.03 to 0.8%** of the trainable parameter budget.
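
To make the mechanism concrete, here is a minimal sketch of the update rule described above, assuming $\mathbf{\Delta W} = \mathbf{U} \mathbf{M} \mathbf{V}^\top$ where $\mathbf{U}, \mathbf{V}$ come from the SVD of the frozen weight and $\mathbf{M}$ is a sparse matrix of learnable coefficients. The sparsity pattern chosen here (diagonal plus a few random off-diagonal entries) is illustrative only, not the authors' reference implementation.

```python
import torch

torch.manual_seed(0)
d_out, d_in = 64, 48
W = torch.randn(d_out, d_in)                       # frozen pre-trained weight
U, S, Vh = torch.linalg.svd(W, full_matrices=False)  # fixed singular vectors

# Illustrative sparsity pattern: one coefficient per singular value (diagonal)
# plus a small number of random off-diagonal entries.
r = S.numel()
pattern = torch.eye(r, dtype=torch.bool)
pattern |= torch.rand(r, r) < 0.02
idx = pattern.nonzero(as_tuple=False)              # (num_coeffs, 2) index pairs

# Only these coefficients are trained; W, U, Vh stay frozen.
coeffs = torch.nn.Parameter(torch.zeros(idx.shape[0]))

def effective_weight():
    """Return W + U M V^T, with M a sparse combination of outer products u_i v_j^T."""
    M = torch.zeros(r, r)
    M[idx[:, 0], idx[:, 1]] = coeffs
    return W + U @ M @ Vh

# Forward pass with the adapted weight; expressivity is controlled by the
# number of trainable coefficients in `coeffs`.
x = torch.randn(8, d_in)
y = x @ effective_weight().T
print(y.shape, "trainable coefficients:", coeffs.numel())
```

Only `coeffs` would be passed to the optimizer, which is what keeps the trainable-parameter count in the 0.006 to 0.25% range reported in the abstract.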

Cite

Text

Lingam et al. "SVFT: Parameter-Efficient Fine-Tuning with Singular Vectors." ICML 2024 Workshops: ES-FoMo-II, 2024.

Markdown

[Lingam et al. "SVFT: Parameter-Efficient Fine-Tuning with Singular Vectors." ICML 2024 Workshops: ES-FoMo-II, 2024.](https://mlanthology.org/icmlw/2024/lingam2024icmlw-svft/)

BibTeX

@inproceedings{lingam2024icmlw-svft,
  title     = {{SVFT: Parameter-Efficient Fine-Tuning with Singular Vectors}},
  author    = {Lingam, Vijay and Neerkaje, Atula Tejaswi and Vavre, Aditya and Shetty, Aneesh and Gudur, Gautham Krishna and Ghosh, Joydeep and Dimakis, Alex and Choi, Eunsol and Bojchevski, Aleksandar and Sanghavi, Sujay},
  booktitle = {ICML 2024 Workshops: ES-FoMo-II},
  year      = {2024},
  url       = {https://mlanthology.org/icmlw/2024/lingam2024icmlw-svft/}
}