Diversity-Rewarded CFG Distillation
Abstract
Generative models are transforming creative domains such as music generation, with inference-time strategies like Classifier-Free Guidance (CFG) playing a crucial role. However, CFG doubles inference cost while limiting originality and diversity across generated content. In this paper, we introduce diversity-rewarded CFG distillation, a novel finetuning procedure that distills the strengths of CFG while addressing its limitations. Our approach optimises two training objectives: (1) a distillation objective, encouraging the model alone (without CFG) to imitate the CFG-augmented predictions, and (2) an RL objective with a diversity reward, promoting the generation of diverse outputs for a given prompt. Through finetuning, we learn model weights able to generate high-quality and diverse outputs, without any inference overhead. This also unlocks the potential of weight-based model merging strategies: by interpolating between the weights of two models (the first focusing on quality, the second on diversity), we can control the quality-diversity trade-off at deployment time, and even further boost performance. We conduct extensive experiments on the MusicLM text-to-music generative model, where our approach surpasses CFG in terms of quality-diversity Pareto optimality. According to human evaluators, our finetuned-then-merged model generates samples of higher quality and diversity than the base model augmented with CFG. Explore our generations at https://musicdiversity.github.io/.
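Below is a minimal, hypothetical sketch of the three ingredients named in the abstract: distilling the CFG-augmented predictions into the model alone, rewarding pairwise diversity across generations for the same prompt, and interpolating the weights of a quality-focused and a diversity-focused checkpoint. It assumes a PyTorch-style token-level model; the function names, the cosine-based diversity measure, and the guidance scale are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only; names and details are assumptions, not the paper's code.
import torch
import torch.nn.functional as F


def cfg_logits(cond_logits, uncond_logits, gamma=3.0):
    """CFG-augmented teacher prediction: uncond + gamma * (cond - uncond)."""
    return uncond_logits + gamma * (cond_logits - uncond_logits)


def distillation_loss(student_logits, cond_logits, uncond_logits, gamma=3.0):
    """Objective (1): make the student (no CFG at inference) match the
    CFG-augmented distribution of a frozen teacher."""
    teacher_probs = F.softmax(cfg_logits(cond_logits, uncond_logits, gamma), dim=-1)
    student_log_probs = F.log_softmax(student_logits, dim=-1)
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")


def diversity_reward(embeddings):
    """Objective (2), sketched as mean pairwise dissimilarity (1 - cosine
    similarity) across several generations for the same prompt.
    embeddings: [num_samples, dim]."""
    z = F.normalize(embeddings, dim=-1)
    sim = z @ z.T
    n = z.shape[0]
    off_diag = sim[~torch.eye(n, dtype=torch.bool)]
    return (1.0 - off_diag).mean()


def merge_weights(quality_state_dict, diversity_state_dict, alpha=0.5):
    """Deployment-time weight interpolation between a quality-focused and a
    diversity-focused finetuned model; alpha controls the trade-off."""
    return {
        k: (1.0 - alpha) * quality_state_dict[k] + alpha * diversity_state_dict[k]
        for k in quality_state_dict
    }
```

In this reading, the distillation term removes the need to run two forward passes (conditional and unconditional) at inference, the diversity reward counteracts the mode-seeking effect of imitating CFG, and the interpolation coefficient alpha exposes the quality-diversity trade-off as a single deployment-time knob.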
Cite
Text
Cideron et al. "Diversity-Rewarded CFG Distillation." International Conference on Learning Representations, 2025.
Markdown
[Cideron et al. "Diversity-Rewarded CFG Distillation." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/cideron2025iclr-diversityrewarded/)
BibTeX
@inproceedings{cideron2025iclr-diversityrewarded,
title = {{Diversity-Rewarded CFG Distillation}},
author = {Cideron, Geoffrey and Agostinelli, Andrea and Ferret, Johan and Girgin, Sertan and Elie, Romuald and Bachem, Olivier and Perrin, Sarah and Rame, Alexandre},
booktitle = {International Conference on Learning Representations},
year = {2025},
url = {https://mlanthology.org/iclr/2025/cideron2025iclr-diversityrewarded/}
}