Dataset Distillation as Data Compression: A Rate-Utility Perspective

Abstract

Driven by the "scale-is-everything" paradigm, modern machine learning increasingly demands ever-larger datasets and models, yielding prohibitive computational and storage requirements. Dataset distillation mitigates this by compressing an original dataset into a small set of synthetic samples while preserving its utility. Yet existing methods either maximize performance under a fixed storage budget or pursue suitable synthetic data representations for redundancy removal, without jointly optimizing both objectives. In this work, we propose a joint rate-utility optimization method for dataset distillation. We parameterize synthetic samples as optimizable latent codes decoded by extremely lightweight networks. We estimate the Shannon entropy of quantized latents as the rate measure and plug in any existing distillation loss as the utility measure, trading them off via a Lagrange multiplier. To enable fair, cross-method comparisons, we introduce bits per class (bpc), a precise storage metric that accounts for sample, label, and decoder parameter costs. On CIFAR-10, CIFAR-100, and ImageNet-128, our method achieves up to 170× greater compression than standard distillation at comparable accuracy. Across diverse bpc budgets, distillation losses, and backbone architectures, our approach consistently establishes better rate-utility trade-offs.
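
As a rough illustration of the trade-off described above, the joint objective can be sketched as the following Lagrangian; the notation (latents z, decoder g_θ, quantizer ⌊·⌉, multiplier λ) is ours and may differ from the paper's:

\min_{z,\,\theta}\;
  \underbrace{\mathcal{L}_{\mathrm{distill}}\!\big(g_\theta(\lfloor z \rceil)\big)}_{\text{utility: any distillation loss on decoded samples}}
  \;+\;
  \lambda\,
  \underbrace{\mathbb{E}\big[-\log_2 p(\lfloor z \rceil)\big]}_{\text{rate: entropy of quantized latents}}

Under the same hedged notation, the bpc metric would amount to the total storage divided by the number of classes:

\mathrm{bpc} \;=\; \frac{\mathrm{bits}(\text{latents}) + \mathrm{bits}(\text{labels}) + \mathrm{bits}(\text{decoder parameters})}{\#\,\text{classes}}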

Cite

Text

Bao et al. "Dataset Distillation as Data Compression: A Rate-Utility Perspective." International Conference on Computer Vision, 2025.

Markdown

[Bao et al. "Dataset Distillation as Data Compression: A Rate-Utility Perspective." International Conference on Computer Vision, 2025.](https://mlanthology.org/iccv/2025/bao2025iccv-dataset/)

BibTeX

@inproceedings{bao2025iccv-dataset,
  title     = {{Dataset Distillation as Data Compression: A Rate-Utility Perspective}},
  author    = {Bao, Youneng and Liu, Yiping and Chen, Zhuo and Liang, Yongsheng and Li, Mu and Ma, Kede},
  booktitle = {International Conference on Computer Vision},
  year      = {2025},
  pages     = {519--529},
  url       = {https://mlanthology.org/iccv/2025/bao2025iccv-dataset/}
}