PRISM: Diversifying Dataset Distillation by Decoupling Architectural Priors
Abstract
Dataset distillation (DD) promises compact yet faithful synthetic data, but existing approaches often inherit the inductive bias of a single teacher model. As dataset size increases, this bias drives generation toward overly smooth, homogeneous samples, reducing intra-class diversity and limiting generalization. We present PRISM (PRIors from diverse Source Models), a framework that disentangles architectural priors during synthesis. PRISM decouples the logit-matching and regularization objectives, supervising them with different teacher architectures: a primary model for logits and a stochastic subset for batch-normalization (BN) alignment. On ImageNet-1K, PRISM consistently and reproducibly outperforms single-teacher methods (e.g., SRe2L) and recent multi-teacher variants (e.g., G-VBSM) at low- and mid-IPC regimes. The generated data also show significantly richer intra-class diversity, as reflected by a notable drop in cosine similarity between features. We further analyze teacher selection strategies (pre- vs. intra-distillation) and introduce a scalable cross-class batch formation scheme for fast parallel synthesis. Code: https://github.com/Brian-Moser/prism
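The following is a minimal sketch (not the authors' implementation) of the decoupled supervision described in the abstract: logits are matched against a single primary teacher, while batch-normalization statistics are aligned against a randomly sampled subset of additional teacher architectures. All function and parameter names (`bn_alignment_loss`, `prism_step`, `num_bn_teachers`, `bn_weight`) are illustrative assumptions.

```python
# Sketch of decoupled teacher supervision for synthesis (assumed names, not official code).
import random
import torch
import torch.nn.functional as F


def bn_alignment_loss(teacher, synthetic_images):
    """Match the batch statistics of synthetic images to the teacher's running BN stats."""
    losses = []

    def hook(module, inputs, _output):
        x = inputs[0]  # (N, C, H, W) input to the BN layer
        mean = x.mean(dim=(0, 2, 3))
        var = x.var(dim=(0, 2, 3), unbiased=False)
        losses.append(F.mse_loss(mean, module.running_mean)
                      + F.mse_loss(var, module.running_var))

    handles = [m.register_forward_hook(hook)
               for m in teacher.modules()
               if isinstance(m, torch.nn.BatchNorm2d)]
    teacher(synthetic_images)          # forward pass only to trigger the hooks
    for h in handles:
        h.remove()
    return torch.stack(losses).sum()


def prism_step(synthetic_images, targets, primary_teacher, teacher_pool,
               num_bn_teachers=2, bn_weight=0.01):
    """One synthesis step: logit supervision from the primary teacher,
    BN alignment from a stochastic subset of the teacher pool."""
    logits = primary_teacher(synthetic_images)
    loss = F.cross_entropy(logits, targets)
    for teacher in random.sample(teacher_pool, k=num_bn_teachers):
        loss = loss + bn_weight * bn_alignment_loss(teacher, synthetic_images)
    return loss
```

In this reading, the primary teacher fixes the classification target while the stochastic BN teachers inject differing architectural priors into the regularizer, which is the decoupling the abstract attributes to the gain in intra-class diversity.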
Cite
Text
Moser et al. "PRISM: Diversifying Dataset Distillation by Decoupling Architectural Priors." Transactions on Machine Learning Research, 2026.
Markdown
[Moser et al. "PRISM: Diversifying Dataset Distillation by Decoupling Architectural Priors." Transactions on Machine Learning Research, 2026.](https://mlanthology.org/tmlr/2026/moser2026tmlr-prism/)
BibTeX
@article{moser2026tmlr-prism,
title = {{PRISM: Diversifying Dataset Distillation by Decoupling Architectural Priors}},
author = {Moser, Brian Bernhard and Sarode, Shalini and Raue, Federico and Frolov, Stanislav and Adamkiewicz, Krzysztof and Shanbhag, Arundhati and Folz, Joachim and Nauen, Tobias Christian and Dengel, Andreas},
journal = {Transactions on Machine Learning Research},
year = {2026},
url = {https://mlanthology.org/tmlr/2026/moser2026tmlr-prism/}
}