Analysis of Classifier-Free Guidance Weight Schedulers
Abstract
Classifier-Free Guidance (CFG) enhances the quality and condition adherence of text-to-image diffusion models. It operates by combining the conditional and unconditional predictions using a fixed weight. However, recent works vary the weight throughout the diffusion process, reporting superior results without providing any rationale or analysis. By conducting comprehensive experiments, this paper provides insights into CFG weight schedulers. Our findings suggest that simple, monotonically increasing weight schedulers consistently lead to improved performance and require merely a single line of code. In addition, more complex parametrized schedulers can be optimized for further improvement, but they do not generalize across different models and tasks.
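To illustrate the kind of one-line change the abstract refers to, below is a minimal sketch of CFG with a time-varying weight. The linear schedule and the function names (cfg_weight, guided_eps) are assumptions for illustration, not the paper's exact implementation.

```python
import torch

def cfg_weight(t: int, num_steps: int, w_max: float = 7.5, w_min: float = 1.0) -> float:
    # Hypothetical linearly increasing scheduler: t counts down from
    # num_steps (high noise) to 0 (low noise), so the guidance weight
    # grows monotonically as denoising progresses.
    return w_min + (w_max - w_min) * (1.0 - t / num_steps)

def guided_eps(eps_uncond: torch.Tensor, eps_cond: torch.Tensor,
               t: int, num_steps: int, w_max: float = 7.5) -> torch.Tensor:
    # Standard CFG combination, but with a time-dependent weight
    # instead of a fixed one.
    w = cfg_weight(t, num_steps, w_max)
    return eps_uncond + w * (eps_cond - eps_uncond)
```

In a typical sampling loop, the only change from fixed-weight CFG is replacing the constant guidance scale with a call such as cfg_weight(t, num_steps) at each denoising step.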
Cite
Text
Wang et al. "Analysis of Classifier-Free Guidance Weight Schedulers." Transactions on Machine Learning Research, 2024.
Markdown
[Wang et al. "Analysis of Classifier-Free Guidance Weight Schedulers." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/wang2024tmlr-analysis/)
BibTeX
@article{wang2024tmlr-analysis,
title = {{Analysis of Classifier-Free Guidance Weight Schedulers}},
author = {Wang, Xi and Dufour, Nicolas and Andreou, Nefeli and Cani, Marie-Paule and Abrevaya, Victoria Fernandez and Picard, David and Kalogeiton, Vicky},
journal = {Transactions on Machine Learning Research},
year = {2024},
url = {https://mlanthology.org/tmlr/2024/wang2024tmlr-analysis/}
}