Fast Relative Entropy Coding with A* Coding
Abstract
Relative entropy coding (REC) algorithms encode a sample from a target distribution $Q$ using a proposal distribution $P$, such that the expected codelength is $O(\mathrm{KL}[Q \,\|\, P])$. REC can be seamlessly integrated with existing learned compression models since, unlike entropy coding, it does not assume discrete $Q$ or $P$, and does not require quantisation. However, general REC algorithms require an intractable $\Omega(\exp(\mathrm{KL}[Q \,\|\, P]))$ runtime. We introduce AS* and AD* coding, two REC algorithms based on A* sampling. We prove that, for continuous distributions over the reals, if the density ratio is unimodal, AS* has $O(D_\infty[Q \,\|\, P])$ expected runtime, where $D_\infty[Q \,\|\, P]$ is the Rényi $\infty$-divergence. We provide experimental evidence that AD* also has $O(D_\infty[Q \,\|\, P])$ expected runtime. We prove that AS* and AD* achieve an expected codelength of $O(\mathrm{KL}[Q \,\|\, P])$. Further, we introduce DAD*, an approximate algorithm based on AD* which retains its favourable runtime and has bias similar to that of alternative methods. Focusing on VAEs, we propose the IsoKL VAE (IKVAE), which can be used with DAD* to further improve compression efficiency. We evaluate A* coding with (IK)VAEs on MNIST, showing that it can losslessly compress images near the theoretically optimal limit.
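To make the two quantities in these bounds concrete, below is a minimal sketch (our illustration, not code from the paper) that computes $\mathrm{KL}[Q \,\|\, P]$, which governs the expected codelength, and $D_\infty[Q \,\|\, P]$, which governs the claimed expected runtime, for a one-dimensional Gaussian target and proposal. The Gaussian setting, function names, and grid-based estimate of the supremum are assumptions made purely for illustration.

# Illustrative sketch only (not code from the paper): the two divergences in the
# stated bounds, for a 1D Gaussian target Q = N(mq, sq^2) and proposal P = N(mp, sp^2).
import numpy as np
from scipy.stats import norm

def kl_gauss(mq, sq, mp, sp):
    """KL[Q || P] in nats, closed form for 1D Gaussians."""
    return np.log(sp / sq) + (sq ** 2 + (mq - mp) ** 2) / (2 * sp ** 2) - 0.5

def d_inf_gauss(mq, sq, mp, sp, width=20.0, n=200001):
    """D_inf[Q || P] = log sup_x q(x) / p(x), estimated on a dense grid.

    For Gaussians this is finite only when sq < sp; otherwise the density
    ratio is unbounded and the divergence is infinite.
    """
    if sq >= sp:
        return np.inf
    lo = min(mq, mp) - width * max(sq, sp)
    hi = max(mq, mp) + width * max(sq, sp)
    x = np.linspace(lo, hi, n)
    log_ratio = norm.logpdf(x, loc=mq, scale=sq) - norm.logpdf(x, loc=mp, scale=sp)
    return float(log_ratio.max())

if __name__ == "__main__":
    mq, sq, mp, sp = 0.5, 0.6, 0.0, 1.0
    kl = kl_gauss(mq, sq, mp, sp)       # governs the expected codelength, O(KL[Q || P])
    dinf = d_inf_gauss(mq, sq, mp, sp)  # governs the claimed expected runtime, O(D_inf[Q || P])
    print(f"KL[Q || P]    = {kl:.3f} nats")
    print(f"D_inf[Q || P] = {dinf:.3f} nats (always >= KL[Q || P])")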
Cite
Text
Flamich et al. "Fast Relative Entropy Coding with A* Coding." International Conference on Machine Learning, 2022.
Markdown
[Flamich et al. "Fast Relative Entropy Coding with A* Coding." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/flamich2022icml-fast/)
BibTeX
@inproceedings{flamich2022icml-fast,
title = {{Fast Relative Entropy Coding with A* Coding}},
author = {Flamich, Gergely and Markou, Stratis and Hernández-Lobato, José Miguel},
booktitle = {International Conference on Machine Learning},
year = {2022},
pages = {6548-6577},
volume = {162},
url = {https://mlanthology.org/icml/2022/flamich2022icml-fast/}
}