Towards Scalable Compression with Universally Quantized Diffusion Models

Abstract

Diffusion probabilistic models have achieved success in many generative modeling tasks, from image generation to inverse problem solving. A distinct feature of these models is that they correspond to deep hierarchical latent variable models optimizing a variational evidence lower bound (ELBO) on the data likelihood. Drawing on a basic connection between likelihood modeling and compression, we explore the potential of diffusion models for progressive coding, resulting in a sequence of bits that can be incrementally transmitted and decoded with progressively improving reconstruction quality. Unlike prior work based on Gaussian diffusion or conditional diffusion models, we propose a new form of diffusion model with uniform noise in the forward process, whose negative ELBO corresponds to the end-to-end compression cost using universal quantization. We obtain promising first results on image compression, achieving competitive rate-distortion and rate-realism performance across a wide range of bitrates with a single model.
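
To make the "universal quantization" mechanism mentioned in the abstract concrete, the following is a minimal NumPy sketch of subtractive dithered quantization, whose key property is that the reconstruction error is distributed exactly like independent uniform noise; this is what allows a diffusion model with a uniform-noise forward process to double as a compression scheme. The function name, step size, and data below are illustrative assumptions, not code or numbers from the paper.

import numpy as np

rng = np.random.default_rng(0)

def universally_quantize(x, delta, u):
    # Subtractive dithered (universal) quantization with step size delta.
    # Sender and receiver share the dither u ~ Uniform(-delta/2, delta/2),
    # e.g. via a common random seed. The sender entropy-codes the integer
    # index round((x + u) / delta); the receiver reconstructs delta*index - u.
    index = np.round((x + u) / delta)   # integers to be entropy-coded
    x_hat = delta * index - u           # receiver-side reconstruction
    return index, x_hat

# Illustrative check (assumed setup, not from the paper): the error
# x_hat - x is uniform on (-delta/2, delta/2) and independent of x,
# i.e. transmitting x this way simulates an additive uniform-noise channel.
delta = 0.5
x = rng.normal(size=100_000)
u = rng.uniform(-delta / 2, delta / 2, size=x.shape)  # shared dither
_, x_hat = universally_quantize(x, delta, u)

err = x_hat - x
print("error range:", err.min(), err.max())      # stays within (-delta/2, delta/2)
print("error mean/var:", err.mean(), err.var())  # approx. 0 and delta**2 / 12

In this reading, the bits spent per step are the entropy-coded indices, so the model's negative ELBO under uniform forward noise tracks the actual end-to-end coding cost, as the abstract states.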

Cite

Text

Yang et al. "Towards Scalable Compression with Universally Quantized Diffusion Models." NeurIPS 2024 Workshops: Compression, 2024.

Markdown

[Yang et al. "Towards Scalable Compression with Universally Quantized Diffusion Models." NeurIPS 2024 Workshops: Compression, 2024.](https://mlanthology.org/neuripsw/2024/yang2024neuripsw-scalable/)

BibTeX

@inproceedings{yang2024neuripsw-scalable,
  title     = {{Towards Scalable Compression with Universally Quantized Diffusion Models}},
  author    = {Yang, Yibo and Will, Justus and Mandt, Stephan},
  booktitle = {NeurIPS 2024 Workshops: Compression},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/yang2024neuripsw-scalable/}
}