Synergistic Self-Supervised and Quantization Learning
Abstract
With the success of self-supervised learning (SSL), it has become a mainstream paradigm to fine-tune from self-supervised pretrained models to boost the performance on downstream tasks. However, we find that current SSL models suffer severe accuracy drops under low-bit quantization, prohibiting their deployment in resource-constrained applications. In this paper, we propose a method called synergistic self-supervised and quantization learning (SSQL) to pretrain quantization-friendly self-supervised models that facilitate downstream deployment. SSQL contrasts the features of the quantized and full precision models in a self-supervised fashion, where the bit-width for the quantized model is randomly selected in each step. SSQL not only significantly improves the accuracy when quantized to lower bit-widths, but also boosts the accuracy of full precision models in most cases. By training only once, SSQL can benefit various downstream tasks at different bit-widths simultaneously. Moreover, the bit-width flexibility is achieved without additional storage overhead, requiring only one copy of weights during training and inference. We theoretically analyze the optimization process of SSQL, and conduct extensive experiments on various benchmarks to further demonstrate the effectiveness of our method.
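The following is a minimal sketch of the core idea described in the abstract, assuming PyTorch, a SimSiam-style negative cosine loss, and simple uniform fake weight quantization with a straight-through estimator. The encoder, predictor, bit-width range, and all names here are illustrative placeholders, not the authors' implementation.

import random

import torch
import torch.nn as nn
import torch.nn.functional as F


def fake_quantize(x, bits):
    # Uniform fake quantization with a straight-through gradient estimate.
    qmax = 2 ** bits - 1
    scale = (x.max() - x.min()).clamp(min=1e-8) / qmax
    zero = x.min()
    q = torch.round((x - zero) / scale).clamp(0, qmax) * scale + zero
    return x + (q - x).detach()  # straight-through estimator


class QuantizableEncoder(nn.Module):
    # Toy two-layer encoder whose weights can be fake-quantized on the fly,
    # so only one copy of weights is kept regardless of bit-width.
    def __init__(self, dim_in=128, dim_out=64):
        super().__init__()
        self.fc1 = nn.Linear(dim_in, dim_out)
        self.fc2 = nn.Linear(dim_out, dim_out)

    def forward(self, x, bits=None):
        for layer in (self.fc1, self.fc2):
            w = layer.weight if bits is None else fake_quantize(layer.weight, bits)
            x = F.relu(F.linear(x, w, layer.bias))
        return x


def ssql_step(encoder, predictor, view1, view2, bit_choices=(2, 3, 4, 5, 6, 7, 8)):
    # One training step: contrast quantized and full-precision features,
    # sampling the quantized branch's bit-width at random (as in the abstract).
    bits = random.choice(bit_choices)
    z_fp = encoder(view1)              # full-precision branch
    z_q = encoder(view2, bits=bits)    # randomly quantized branch
    p_q = predictor(z_q)
    # Negative cosine similarity, stop-gradient on the full-precision target.
    return -F.cosine_similarity(p_q, z_fp.detach(), dim=-1).mean()


# Illustrative usage with random tensors standing in for two augmented views.
encoder = QuantizableEncoder()
predictor = nn.Linear(64, 64)
opt = torch.optim.SGD(list(encoder.parameters()) + list(predictor.parameters()), lr=0.05)
view1, view2 = torch.randn(32, 128), torch.randn(32, 128)
loss = ssql_step(encoder, predictor, view1, view2)
loss.backward()
opt.step()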
Cite
Text
Cao et al. "Synergistic Self-Supervised and Quantization Learning." Proceedings of the European Conference on Computer Vision (ECCV), 2022. doi:10.1007/978-3-031-20056-4_34
Markdown
[Cao et al. "Synergistic Self-Supervised and Quantization Learning." Proceedings of the European Conference on Computer Vision (ECCV), 2022.](https://mlanthology.org/eccv/2022/cao2022eccv-synergistic/) doi:10.1007/978-3-031-20056-4_34
BibTeX
@inproceedings{cao2022eccv-synergistic,
title = {{Synergistic Self-Supervised and Quantization Learning}},
author = {Cao, Yun-Hao and Sun, Peiqin and Huang, Yechang and Wu, Jianxin and Zhou, Shuchang},
booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
year = {2022},
doi = {10.1007/978-3-031-20056-4_34},
url = {https://mlanthology.org/eccv/2022/cao2022eccv-synergistic/}
}