HALO: Hadamard-Assisted Lower-Precision Optimization for LLMs

Abstract

Quantized training of Large Language Models (LLMs) remains an open challenge, as maintaining accuracy while performing all matrix multiplications in low precision has proven difficult. This is particularly the case when fine-tuning pre-trained models, which can have large weight, activation, and error (output gradient) outliers that hinder lower-precision optimization. To address this, we present HALO, a new quantization-aware training approach for Transformers that enables accurate and efficient low-precision training by combining 1) strategic placement of Hadamard rotations in both the forward and backward passes, which mitigate outliers, 2) high-performance kernel support, and 3) FSDP integration for low-precision communication. Our approach ensures that all large matrix multiplications during the forward and backward passes are executed in lower precision. Applied to LLaMA models, HALO achieves near-full-precision results during fine-tuning on various tasks, while delivering up to a 1.41x end-to-end speedup for full fine-tuning on RTX 4090 GPUs. HALO efficiently supports both standard and parameter-efficient fine-tuning (PEFT). Our results demonstrate the first practical approach to fully quantized LLM fine-tuning that maintains accuracy in INT8 and FP6 precision while delivering performance benefits.
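
As a rough illustration of the core idea (not the authors' implementation or kernels), the sketch below applies a normalized Hadamard rotation to activations and weights before symmetric INT8 quantization, exploiting the identity Y = X W^T = (X H)(W H)^T for an orthonormal H: the rotation spreads outliers across channels, so the quantization error of the low-precision matmul drops. All names here (hadamard_matrix, quantize_int8, halo_like_linear) are illustrative placeholders; the backward pass, FP6 support, and FSDP communication described in the paper are omitted.

import torch

def hadamard_matrix(d: int, dtype=torch.float32) -> torch.Tensor:
    # Sylvester construction of a normalized d x d Hadamard matrix
    # (requires d to be a power of two); after scaling, H @ H.T == I.
    assert d & (d - 1) == 0, "d must be a power of two"
    H = torch.ones(1, 1, dtype=dtype)
    while H.shape[0] < d:
        H = torch.cat([torch.cat([H, H], dim=1),
                       torch.cat([H, -H], dim=1)], dim=0)
    return H / d ** 0.5

def quantize_int8(t: torch.Tensor):
    # Symmetric per-tensor INT8 quantization: int8 values plus a scale.
    scale = t.abs().max().clamp(min=1e-8) / 127.0
    q = torch.clamp(torch.round(t / scale), -128, 127).to(torch.int8)
    return q, scale

def halo_like_linear(x: torch.Tensor, w: torch.Tensor) -> torch.Tensor:
    # Forward pass Y ~= (X H)(W H)^T with both operands quantized to INT8.
    d = x.shape[-1]
    H = hadamard_matrix(d, dtype=x.dtype)
    xq, sx = quantize_int8(x @ H)   # rotated activations
    wq, sw = quantize_int8(w @ H)   # rotated weights
    # A real kernel would run this matmul on INT8 tensor cores; here it is
    # emulated in floating point for portability, then dequantized.
    y = xq.to(x.dtype) @ wq.to(x.dtype).T
    return y * (sx * sw)

x = torch.randn(4, 256)
w = torch.randn(512, 256)
ref = x @ w.T
err = (halo_like_linear(x, w) - ref).norm() / ref.norm()
print(f"relative error vs. full precision: {err.item():.3%}")

The printed relative error is small because the rotated tensors have no dominant outlier channels, which is the property the paper's strategically placed Hadamard rotations provide for both forward and backward matmuls.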

Cite

Text

Ashkboos et al. "HALO: Hadamard-Assisted Lower-Precision Optimization for LLMs." Advances in Neural Information Processing Systems, 2025.

Markdown

[Ashkboos et al. "HALO: Hadamard-Assisted Lower-Precision Optimization for LLMs." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/ashkboos2025neurips-halo/)

BibTeX

@inproceedings{ashkboos2025neurips-halo,
  title     = {{HALO: Hadamard-Assisted Lower-Precision Optimization for LLMs}},
  author    = {Ashkboos, Saleh and Nikdan, Mahdi and Tabesh, Soroush and Castro, Roberto L. and Hoefler, Torsten and Alistarh, Dan},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/ashkboos2025neurips-halo/}
}