Importance Matching Lemma for Lossy Compression with Side Information
Abstract
We propose two extensions to existing importance-sampling-based methods for lossy compression. First, we introduce an importance-sampling-based compression scheme that is a variant of ordered random coding (Theis and Ahmed, 2022) and is amenable to direct evaluation of the achievable compression rate for a finite number of samples. Our second and major contribution is the *importance matching lemma*, a finite-proposal counterpart of the recently introduced Poisson matching lemma (Li and Anantharam, 2021). By integrating with deep learning, we provide a new coding scheme for distributed lossy compression with side information at the decoder. We demonstrate the effectiveness of the proposed scheme through experiments on synthetic Gaussian sources, distributed image compression on MNIST, and vertical federated learning on CIFAR-10.
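To illustrate the general idea behind importance-sampling-based compression (in the spirit of minimal/ordered random coding, not the paper's exact scheme), here is a hedged toy sketch: encoder and decoder share a stream of proposal samples from a fixed prior, the encoder selects one index with probability proportional to the importance weights of a target posterior, and only that index needs to be transmitted. All names and distributions below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def importance_sample_index(mu_q, sigma_q, n_candidates=256):
    """Toy importance-sampling coder (illustrative, not the paper's method).

    Encoder and decoder share candidates z_i ~ p = N(0, 1) via a common
    random seed; the encoder draws index K with probability proportional
    to the importance weights q(z_K) / p(z_K), where q = N(mu_q, sigma_q^2),
    and transmits only K (about log2(n_candidates) bits).
    """
    z = rng.standard_normal(n_candidates)  # shared proposal samples
    # log importance weights: log q(z) - log p(z)
    log_w = (-0.5 * ((z - mu_q) / sigma_q) ** 2 - np.log(sigma_q)) \
            - (-0.5 * z ** 2)
    w = np.exp(log_w - log_w.max())        # stabilized weights
    k = rng.choice(n_candidates, p=w / w.sum())
    return k, z[k]

k, z_k = importance_sample_index(mu_q=1.0, sigma_q=0.5)
```

Since the decoder can regenerate the same candidate list from the shared seed, receiving `k` alone suffices to recover the sample `z_k`; the matching-lemma machinery in the paper analyzes how well such selected indices can be coupled across encoder and decoder when side information is involved.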
Cite
Text
Phan et al. "Importance Matching Lemma for Lossy Compression with Side Information." Artificial Intelligence and Statistics, 2024.
Markdown
[Phan et al. "Importance Matching Lemma for Lossy Compression with Side Information." Artificial Intelligence and Statistics, 2024.](https://mlanthology.org/aistats/2024/phan2024aistats-importance/)
BibTeX
@inproceedings{phan2024aistats-importance,
title = {{Importance Matching Lemma for Lossy Compression with Side Information}},
author = {Phan, Buu and Khisti, Ashish and Louizos, Christos},
booktitle = {Artificial Intelligence and Statistics},
year = {2024},
pages = {1387--1395},
volume = {238},
url = {https://mlanthology.org/aistats/2024/phan2024aistats-importance/}
}