Towards Better & Faster Autoregressive Image Generation: From the Perspective of Entropy

Abstract

In this work, we first revisit the sampling issues in current autoregressive (AR) image generation models and identify that image tokens, unlike text tokens, exhibit lower information density and a non-uniform spatial distribution. Accordingly, we present an entropy-informed decoding strategy that improves autoregressive generation quality while accelerating synthesis. Specifically, the proposed method introduces two main innovations: 1) dynamic temperature control guided by the spatial entropy of token distributions, which improves the balance between content diversity, alignment accuracy, and structural coherence in both mask-based and scale-wise models without extra computational overhead, and 2) entropy-aware acceptance rules in speculative decoding, which achieve near-lossless generation at about 85% of the inference cost of conventional acceleration methods. Extensive experiments across multiple benchmarks using diverse AR image generation models demonstrate the effectiveness and generalizability of our approach in enhancing both generation quality and sampling speed.
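To make the first idea concrete, here is a minimal sketch of entropy-guided temperature control: the Shannon entropy of a position's token distribution is normalized and mapped linearly to a sampling temperature, so confident (low-entropy) positions sample sharply while uncertain (high-entropy) positions sample more diversely. The function names, the linear mapping, and the temperature bounds `t_min`/`t_max` are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def token_entropy(logits):
    """Shannon entropy (nats) of the softmax distribution over the vocabulary."""
    logits = logits - logits.max()          # numerical stability
    p = np.exp(logits)
    p /= p.sum()
    return float(-(p * np.log(p + 1e-12)).sum())

def entropy_scaled_temperature(logits, t_min=0.8, t_max=1.2):
    """Map normalized entropy in [0, 1] to a temperature in [t_min, t_max].

    The maximum possible entropy is log(V) for a uniform distribution
    over V tokens; dividing by it yields a scale-free confidence signal.
    The bounds and the linear map are hypothetical choices for illustration.
    """
    h = token_entropy(logits)
    h_max = np.log(len(logits))             # entropy of the uniform distribution
    return t_min + (t_max - t_min) * (h / h_max)

def sample_with_entropy_temperature(logits, rng):
    """Sample one token id using the entropy-adapted temperature."""
    t = entropy_scaled_temperature(logits)
    scaled = (logits - logits.max()) / t
    p = np.exp(scaled)
    p /= p.sum()
    return int(rng.choice(len(logits), p=p))
```

For example, a uniform distribution over the vocabulary gets the maximum temperature, while a sharply peaked one gets a temperature near the lower bound, so no extra forward passes are needed: the temperature is derived from logits the model already produces.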

Cite

Text

Ma et al. "Towards Better & Faster Autoregressive Image Generation: From the Perspective of Entropy." Advances in Neural Information Processing Systems, 2025.

Markdown

[Ma et al. "Towards Better & Faster Autoregressive Image Generation: From the Perspective of Entropy." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/ma2025neurips-better/)

BibTeX

@inproceedings{ma2025neurips-better,
  title     = {{Towards Better \& Faster Autoregressive Image Generation: From the Perspective of Entropy}},
  author    = {Ma, Xiaoxiao and Zhao, Feng and Ling, Pengyang and Qiu, Haibo and Wei, Zhixiang and Yu, Hu and Huang, Jie and Zeng, Zhixiong and Ma, Lin},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/ma2025neurips-better/}
}