BoA: Attention-Aware Post-Training Quantization Without Backpropagation
Abstract
Post-training quantization (PTQ) is a promising solution for deploying large language models (LLMs) on resource-constrained devices. Early methods, developed for small-scale networks such as ResNet, rely on gradient-based optimization, which becomes impractical for hyper-scale LLMs with billions of parameters. Recently proposed backpropagation-free or transformation-based methods alleviate this issue, but they either ignore inter-layer interactions or fall back on naive nearest-rounding weight assignment to avoid the heavy computational cost of weight optimization. In this paper, we introduce a novel backpropagation-free PTQ algorithm that optimizes quantized weights while accounting for inter-layer dependencies. The key innovation is the development of attention-aware Hessian matrices that capture inter-layer interactions within the attention module. Extensive experiments demonstrate that our approach not only outperforms existing weight-quantization methods but also combines well with conventional techniques for suppressing activation outliers, yielding state-of-the-art weight-activation quantization performance. The code will be available at https://github.com/SamsungLabs/BoA.
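As a rough illustration of the layer-wise objective such backpropagation-free methods optimize, the sketch below implements a generic GPTQ-style column-by-column quantization of one linear layer against a Hessian proxy H = 2XX^T built from calibration activations. This is a minimal sketch under stated assumptions, not BoA itself: the paper's attention-aware Hessians, which also capture inter-layer interactions within the attention module, would replace the per-layer H used here, and all names and quantizer details are illustrative.

import numpy as np

def quantize_layer_gptq_style(W, X, n_bits=4, damp=1e-2):
    """Minimal GPTQ-style sketch: minimize ||W X - Wq X||_F^2 without
    backpropagation, using a Hessian proxy from calibration data.

    W : (d_out, d_in) weight matrix of a linear layer
    X : (d_in, n_samples) calibration activations

    BoA replaces the per-layer H = 2 X X^T below with attention-aware
    Hessians capturing inter-layer interactions (not reproduced here).
    """
    d_out, d_in = W.shape
    H = 2.0 * (X @ X.T)                              # layer-wise Hessian proxy
    H += damp * np.mean(np.diag(H)) * np.eye(d_in)   # damping for invertibility

    # Upper Cholesky factor of H^{-1}; its rows give the error-propagation
    # coefficients for the greedy column-by-column quantization below.
    U = np.linalg.cholesky(np.linalg.inv(H)).T

    # Illustrative uniform symmetric per-row quantizer.
    qmax = 2 ** (n_bits - 1) - 1
    scale = np.maximum(np.abs(W).max(axis=1) / qmax, 1e-12)   # (d_out,)
    quantize = lambda w: np.clip(np.round(w / scale), -qmax - 1, qmax) * scale

    Wq = W.astype(np.float64).copy()
    for j in range(d_in):
        q = quantize(Wq[:, j])
        err = (Wq[:, j] - q) / U[j, j]
        Wq[:, j] = q
        # Push the rounding error onto not-yet-quantized columns so they
        # can compensate for it, instead of naive nearest rounding.
        Wq[:, j + 1:] -= np.outer(err, U[j, j + 1:])
    return Wq

The error-propagation step in the loop is what separates Hessian-guided weight assignment from the nearest-rounding baseline the abstract criticizes: each column's rounding error is absorbed by the columns quantized after it.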
Cite
Text
Kim et al. "BoA: Attention-Aware Post-Training Quantization Without Backpropagation." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown
[Kim et al. "BoA: Attention-Aware Post-Training Quantization Without Backpropagation." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/kim2025icml-boa/)

BibTeX
@inproceedings{kim2025icml-boa,
  title     = {{BoA: Attention-Aware Post-Training Quantization Without Backpropagation}},
  author    = {Kim, Junhan and Kim, Ho-Young and Cho, Eulrang and Lee, Chungman and Kim, Joonyoung and Jeon, Yongkweon},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {30132--30152},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/kim2025icml-boa/}
}