Stabilizing the Training of Consistency Models with Score Guidance

Abstract

Consistency models achieve strong sample quality with only a few sampling steps, even without relying on a pre-trained teacher diffusion model. However, as the total number of discretization steps increases, they suffer from unstable training caused by large variance, which leads to suboptimal performance. It is known that this instability can be mitigated by initializing their weights with a pre-trained diffusion model, which suggests that diffusion models may be effective for addressing the problem. Inspired by this, we introduce a transformation layer, termed the score head, which is trained jointly with the consistency model so that the two together form a larger diffusion model. Additionally updating the consistency model with gradients coming from the score head reduces variance during training. We also observe that this joint training scheme helps the consistency model learn the common low-level features acquired by the diffusion model. Sample quality improves accordingly when measured on CIFAR-10.
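To make the joint training scheme concrete, the sketch below illustrates one possible reading of the abstract: a shared backbone produces the consistency prediction, an extra score head maps the shared features to a denoising (score-matching) prediction, and gradients from both losses update the backbone. All names (Backbone, ScoreHead, joint_step), the specific denoising parameterization, and the loss weight lambda_score are assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

# Hypothetical sketch only: module names, the simple MSE losses, and the
# loss weighting are assumptions, not the method as specified in the paper.

class Backbone(nn.Module):
    """Shared network producing features and the consistency-model output."""
    def __init__(self, dim=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(dim + 1, 128), nn.SiLU(),
            nn.Linear(128, 128), nn.SiLU(),
        )
        self.consistency_out = nn.Linear(128, dim)

    def forward(self, x_t, sigma):
        h = self.body(torch.cat([x_t, sigma[:, None]], dim=-1))
        return self.consistency_out(h), h  # consistency prediction + shared features


class ScoreHead(nn.Module):
    """Extra transformation layer turning shared features into a denoising prediction,
    so that backbone + score head together act as a diffusion model."""
    def __init__(self, dim=32):
        super().__init__()
        self.head = nn.Linear(128, dim)

    def forward(self, h):
        return self.head(h)


def joint_step(backbone, score_head, ema_backbone, x0, sigma_t, sigma_s, lambda_score=1.0):
    """One joint update: a consistency loss between adjacent noise levels plus a
    denoising loss through the score head; gradients from the score head also
    flow into the shared backbone, providing the stabilizing signal."""
    noise = torch.randn_like(x0)
    x_t = x0 + sigma_t[:, None] * noise   # heavier-noised sample
    x_s = x0 + sigma_s[:, None] * noise   # lighter-noised sample (same noise)

    f_t, h_t = backbone(x_t, sigma_t)
    with torch.no_grad():
        f_s, _ = ema_backbone(x_s, sigma_s)  # target from an EMA / stop-gradient copy

    consistency_loss = ((f_t - f_s) ** 2).mean()
    score_loss = ((score_head(h_t) - x0) ** 2).mean()  # denoising objective
    return consistency_loss + lambda_score * score_loss
```

The key point the sketch tries to capture is that the score head's loss is backpropagated through the shared backbone, so the consistency model receives an additional, lower-variance diffusion-style gradient at every step.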

Cite

Text

Lee et al. "Stabilizing the Training of Consistency Models with Score Guidance." ICML 2024 Workshops: SPIGM, 2024.

Markdown

[Lee et al. "Stabilizing the Training of Consistency Models with Score Guidance." ICML 2024 Workshops: SPIGM, 2024.](https://mlanthology.org/icmlw/2024/lee2024icmlw-stabilizing/)

BibTeX

@inproceedings{lee2024icmlw-stabilizing,
  title     = {{Stabilizing the Training of Consistency Models with Score Guidance}},
  author    = {Lee, Jeongjun and Park, Jonggeon and Yoon, Jongmin and Lee, Juho},
  booktitle = {ICML 2024 Workshops: SPIGM},
  year      = {2024},
  url       = {https://mlanthology.org/icmlw/2024/lee2024icmlw-stabilizing/}
}