DAMix: Exploiting Deep Autoregressive Model Zoo for Improving Lossless Compression Generalization
Abstract
Deep generative models have demonstrated superior performance in lossless compression on identically distributed data. In real-world scenarios, however, the data to be compressed come from diverse distributions that are usually not known in advance, so commercially viable neural compression must generalize well Out-of-Distribution (OoD). Compared with traditional compression methods, deep learning methods are intrinsically weaker at OoD generalization. In this work, we tackle this challenge by exploiting a zoo of Deep Autoregressive models (DAMix). We build a model zoo consisting of autoregressive models trained on data from diverse distributions. At test time, we select useful expert models with a simple model-evaluation score and adaptively aggregate the predictions of the selected models. Assuming the outputs of each expert model are biased in favor of its training distribution, we propose a von Mises-Fisher (vMF) based filter that recovers the unbiased predictions, which provide more accurate density estimates than any single model. We derive the posterior of the unbiased predictions as well as the concentration parameters of the filter, and propose a novel temporal Stein variational gradient descent for sequential data to adaptively update the posterior distributions. We evaluate DAMix on 22 image datasets, covering both in-distribution and OoD data, and show that exploiting unbiased predictions improves compression by up to 45.6% over a single model trained on ImageNet.
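For intuition, below is a minimal, self-contained Python/NumPy sketch of the select-then-aggregate pipeline the abstract describes. Everything in it is hypothetical illustration: the Expert class, the probe-based bits-per-symbol score, and the softmax weighting are stand-ins. The paper instead recovers unbiased predictions with the vMF-based filter (vMF density proportional to exp(kappa * mu^T x)) whose posterior is updated by temporal Stein variational gradient descent; neither is implemented here.

import numpy as np

rng = np.random.default_rng(0)

class Expert:
    """Stand-in for a trained deep autoregressive model (hypothetical API).
    Holds one fixed categorical distribution over a small symbol alphabet;
    a real model would condition on the context it is given."""
    def __init__(self, probs):
        self.probs = np.asarray(probs)

    def predict(self, context):
        return self.probs

def bits_per_symbol(model, probe):
    """Simple model-evaluation score: mean negative log2-likelihood of a
    short probe sequence, i.e. the codelength the model would achieve."""
    p = np.array([model.predict(probe[:t])[s] for t, s in enumerate(probe)])
    return -np.mean(np.log2(p))

def damix_predict(zoo, probe, context, top_k=2):
    """Select the top_k lowest-codelength experts on the probe, then return
    a weighted mixture of their next-symbol distributions. The softmax
    weights are a stand-in for the paper's vMF filter + temporal SVGD."""
    scores = np.array([bits_per_symbol(m, probe) for m in zoo])
    keep = np.argsort(scores)[:top_k]
    w = np.exp(-scores[keep])
    w /= w.sum()
    mix = sum(wi * zoo[i].predict(context) for wi, i in zip(w, keep))
    return mix  # per-symbol distribution handed to an entropy coder

# Toy usage: a 4-symbol alphabet and three experts biased toward
# different training distributions (random Dirichlet draws).
zoo = [Expert(rng.dirichlet(np.ones(4))) for _ in range(3)]
probe = rng.integers(0, 4, size=16)   # first symbols of the test stream
print(damix_predict(zoo, probe, probe[:8]))

Because lower bits-per-symbol on the probe means a shorter codelength, the softmax over negated scores gives more weight to experts whose training distribution matches the test stream, which mirrors (in a much cruder form) the adaptive aggregation the abstract describes.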
Cite
Text
Dong et al. "DAMix: Exploiting Deep Autoregressive Model Zoo for Improving Lossless Compression Generalization." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I4.25543
Markdown
[Dong et al. "DAMix: Exploiting Deep Autoregressive Model Zoo for Improving Lossless Compression Generalization." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/dong2023aaai-damix/) doi:10.1609/AAAI.V37I4.25543
BibTeX
@inproceedings{dong2023aaai-damix,
title = {{DAMix: Exploiting Deep Autoregressive Model Zoo for Improving Lossless Compression Generalization}},
author = {Dong, Qishi and Zhou, Fengwei and Kang, Ning and Xie, Chuanlong and Zhang, Shifeng and Li, Jiawei and Peng, Heng and Li, Zhenguo},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2023},
pages = {4250--4258},
doi = {10.1609/AAAI.V37I4.25543},
url = {https://mlanthology.org/aaai/2023/dong2023aaai-damix/}
}