Improving Lossless Compression Rates via Monte Carlo Bits-Back Coding

Abstract

Latent variable models have been successfully applied in lossless compression with the bits-back coding algorithm. However, bits-back suffers from an increase in the bitrate equal to the KL divergence between the approximate posterior and the true posterior. In this paper, we show how to remove this gap asymptotically by deriving bits-back schemes from tighter variational bounds. The key idea is to exploit extended space representations of Monte Carlo estimators of the marginal likelihood. Naively applied, our schemes would require more initial bits than the standard bits-back coder, but we show how to drastically reduce this additional cost with couplings in the latent space. We demonstrate improved lossless compression rates in a variety of settings.
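The bitrate gap mentioned in the abstract follows from the standard bits-back identity. A sketch of the derivation, assuming a latent variable model \(p(x,z)\) with approximate posterior \(q(z\mid x)\) (notation not taken from the paper itself):

```latex
% Expected net bitrate of bits-back coding:
% encode z ~ q(z|x), then x and z under the model, recovering the bits spent on z.
\mathbb{E}_{q(z\mid x)}\bigl[-\log p(x,z) + \log q(z\mid x)\bigr]
  = -\log p(x) + \mathrm{KL}\bigl(q(z\mid x)\,\|\,p(z\mid x)\bigr)
```

The first term is the optimal rate \(-\log p(x)\); the KL term is the overhead that the paper's Monte Carlo bits-back schemes remove asymptotically by coding with tighter variational bounds.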

Cite

Text

Ruan et al. "Improving Lossless Compression Rates via Monte Carlo Bits-Back Coding." ICLR 2021 Workshops: Neural_Compression, 2021.

Markdown

[Ruan et al. "Improving Lossless Compression Rates via Monte Carlo Bits-Back Coding." ICLR 2021 Workshops: Neural_Compression, 2021.](https://mlanthology.org/iclrw/2021/ruan2021iclrw-improving/)

BibTeX

@inproceedings{ruan2021iclrw-improving,
  title     = {{Improving Lossless Compression Rates via Monte Carlo Bits-Back Coding}},
  author    = {Ruan, Yangjun and Ullrich, Karen and Severo, Daniel and Townsend, James and Khisti, Ashish J. and Doucet, Arnaud and Makhzani, Alireza and Maddison, Chris J.},
  booktitle = {ICLR 2021 Workshops: Neural_Compression},
  year      = {2021},
  url       = {https://mlanthology.org/iclrw/2021/ruan2021iclrw-improving/}
}