MLIC$^{++}$: Linear Complexity Multi-Reference Entropy Modeling for Learned Image Compression
Abstract
Recently, multi-reference entropy models have been proposed, which capture channel-wise, local spatial, and global spatial correlations. Previous works adopt attention to capture global correlations; however, its quadratic complexity limits the potential of high-resolution image coding. In this paper, we propose capturing global correlations with linear complexity via a decomposition of the softmax operation. Based on this, we propose MLIC$^{++}$, a learned image compression model with linear-complexity multi-reference entropy modeling. MLIC$^{++}$ is more efficient and reduces BD-rate by $12.44$% on the Kodak dataset compared to VTM-17.0 when measured in PSNR.
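The linear-complexity mechanism can be illustrated with a minimal sketch. In the linear-attention literature, replacing the row-wise softmax in $\mathrm{softmax}(QK^\top)V$ with a separable kernel $\phi(Q)\phi(K)^\top$ lets the product be reassociated as $\phi(Q)\,(\phi(K)^\top V)$, reducing the cost from quadratic to linear in the number of tokens. The feature map used below, $\phi(x) = \mathrm{elu}(x) + 1$, and the function name are illustrative assumptions, not necessarily the paper's exact formulation.

# Sketch of linear-complexity attention via softmax decomposition.
# Standard attention computes softmax(Q K^T / sqrt(d)) V, costing O(N^2)
# in the token count N. A separable kernel phi(Q) phi(K)^T allows
# reassociating the product as phi(Q) (phi(K)^T V), which costs O(N).
import torch
import torch.nn.functional as F

def linear_attention(q, k, v, eps=1e-6):
    """q, k: (B, N, d); v: (B, N, d_v). Returns (B, N, d_v) in O(N) time."""
    phi_q = F.elu(q) + 1.0  # non-negative feature map (one common choice)
    phi_k = F.elu(k) + 1.0
    # Aggregate keys and values first: (B, d, d_v), never forming an N x N map.
    kv = torch.einsum('bnd,bne->bde', phi_k, v)
    # Per-query normalizer, analogous to the softmax denominator: (B, N).
    z = torch.einsum('bnd,bd->bn', phi_q, phi_k.sum(dim=1))
    out = torch.einsum('bnd,bde->bne', phi_q, kv)
    return out / (z.unsqueeze(-1) + eps)

# Usage: attend over 4096 tokens without a 4096 x 4096 attention matrix.
q = torch.randn(1, 4096, 64)
k = torch.randn(1, 4096, 64)
v = torch.randn(1, 4096, 64)
print(linear_attention(q, k, v).shape)  # torch.Size([1, 4096, 64])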
Cite
Text
Jiang and Wang. "MLIC$^{++}$: Linear Complexity Multi-Reference Entropy Modeling for Learned Image Compression." ICML 2023 Workshops: NCW, 2023.
Markdown
[Jiang and Wang. "MLIC$^{++}$: Linear Complexity Multi-Reference Entropy Modeling for Learned Image Compression." ICML 2023 Workshops: NCW, 2023.](https://mlanthology.org/icmlw/2023/jiang2023icmlw-mlic/)
BibTeX
@inproceedings{jiang2023icmlw-mlic,
title = {{MLIC$^{++}$: Linear Complexity Multi-Reference Entropy Modeling for Learned Image Compression}},
author = {Jiang, Wei and Wang, Ronggang},
booktitle = {ICML 2023 Workshops: NCW},
year = {2023},
url = {https://mlanthology.org/icmlw/2023/jiang2023icmlw-mlic/}
}