Integer Networks for Data Compression with Latent-Variable Models
Abstract
We consider the problem of using variational latent-variable models for data compression. For such models to produce a compressed binary sequence, which is the universal data representation in a digital world, the latent representation needs to be subjected to entropy coding. Range coding is an optimal entropy coding technique, but it can fail catastrophically if the computation of the prior differs even slightly between the sending and receiving sides. Unfortunately, this is a common scenario when floating-point math is used and the sender and receiver operate on different hardware or software platforms, as numerical round-off is often platform dependent. We propose using integer networks as a universal solution to this problem, and demonstrate that they enable reliable cross-platform encoding and decoding of images using variational models.
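The failure mode the abstract describes disappears once every arithmetic operation used to compute the prior is exactly defined. Below is a minimal NumPy sketch of one integer-network layer in the spirit of the paper; the function name, the int64 accumulator, the scalar divisor c, and the clipping bound of 255 are illustrative assumptions rather than the authors' exact formulation:

```python
import numpy as np

def integer_dense(h, W, b, c, qmax=255):
    """One integer-arithmetic layer: integer matmul, integer bias,
    integer division, and a saturating clip. Every step is exactly
    defined, so sender and receiver compute bit-identical outputs
    regardless of hardware or software platform."""
    # Accumulate in a wide integer type so the dot products cannot overflow.
    acc = h.astype(np.int64) @ W.astype(np.int64) + b.astype(np.int64)
    # Integer (floor) division by an integer divisor replaces
    # floating-point rescaling, whose rounding is platform dependent.
    out = acc // c
    # Saturating "QReLU"-style nonlinearity: clip to [0, qmax].
    return np.clip(out, 0, qmax).astype(np.int32)

# Deterministic toy usage: identical inputs yield identical outputs
# on any platform, so a prior derived from them cannot desynchronize
# the range coder.
rng = np.random.default_rng(0)
h = rng.integers(0, 256, size=8)        # quantized layer input
W = rng.integers(-8, 8, size=(8, 4))    # integer weights
b = rng.integers(-64, 64, size=4)       # integer biases
print(integer_dense(h, W, b, c=16))
```

A floating-point layer computing the same quantity could round its last bit differently on two platforms; when that output parameterizes the prior used for entropy coding, even a one-bit disagreement can derail range decoding, which is why exact integer arithmetic is the safer choice here.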
Cite
Text
Ballé et al. "Integer Networks for Data Compression with Latent-Variable Models." International Conference on Learning Representations, 2019.
Markdown
[Ballé et al. "Integer Networks for Data Compression with Latent-Variable Models." International Conference on Learning Representations, 2019.](https://mlanthology.org/iclr/2019/balle2019iclr-integer/)
BibTeX
@inproceedings{balle2019iclr-integer,
  title = {{Integer Networks for Data Compression with Latent-Variable Models}},
  author = {Ballé, Johannes and Johnston, Nick and Minnen, David},
  booktitle = {International Conference on Learning Representations},
  year = {2019},
  url = {https://mlanthology.org/iclr/2019/balle2019iclr-integer/}
}