Iterative Importance Fine-Tuning of Diffusion Models
Abstract
Diffusion models are an important tool for generative modelling, serving as effective priors in applications such as imaging and protein design. A key challenge in applying diffusion models to downstream tasks is efficiently sampling from the resulting posterior distributions, which can be addressed using the $h$-transform. This work introduces a self-supervised algorithm for fine-tuning diffusion models by estimating the $h$-transform, enabling amortised conditional sampling. Our method iteratively refines the $h$-transform using a synthetic dataset resampled with path-based importance weights. We demonstrate the effectiveness of this framework on class-conditional sampling and reward fine-tuning for text-to-image diffusion models.
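To make the loop concrete, below is a minimal self-contained PyTorch sketch of one plausible reading of the scheme, not the authors' implementation: a 2-D standard-Gaussian base model with an analytic score, a Langevin-type sampler standing in for the reverse diffusion SDE, Girsanov-style path weights, multinomial resampling, and a denoising-score-matching style refit of the $h$-transform network. All names (`HNet`, `base_score`, `log_reward`, `sample_with_weights`, `refine`) and design choices here are illustrative assumptions.

```python
import torch
import torch.nn as nn

class HNet(nn.Module):
    """Small MLP standing in for the learned h-transform correction h(x, t)."""
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, t):
        t = t.expand(x.shape[0], 1) if t.dim() == 0 else t
        return self.net(torch.cat([x, t], dim=1))

def base_score(x, t):
    # Analytic score of a standard Gaussian: a toy stand-in for a
    # pretrained diffusion model's score network.
    return -x

def log_reward(x0):
    # Toy log-reward favouring samples near the point (2, 2).
    return -((x0 - 2.0) ** 2).sum(dim=1)

def sample_with_weights(h_net, n_paths=512, n_steps=100, sigma=2 ** 0.5):
    """Langevin-type sampler under the h-transformed drift, accumulating
    Girsanov-style log-ratios of the base path measure w.r.t. the
    controlled one (the 'path-based' part of the importance weights)."""
    dt = 1.0 / n_steps
    x = torch.randn(n_paths, 2)  # start from the prior
    log_ratio = torch.zeros(n_paths)
    with torch.no_grad():
        for k in range(n_steps, 0, -1):
            t = torch.tensor(k * dt)
            h = h_net(x, t)
            dW = torch.randn_like(x) * dt ** 0.5
            # Girsanov: increment of log d(base)/d(controlled) along the path.
            log_ratio += -(h * dW).sum(1) / sigma - 0.5 * (h ** 2).sum(1) * dt / sigma ** 2
            x = x + (base_score(x, t) + h) * dt + sigma * dW
    return x, log_ratio

def refine(h_net, opt, data, n_grad_steps=200):
    """Refit the h-transform on the resampled synthetic dataset with a
    denoising-score-matching style regression (one plausible choice)."""
    for _ in range(n_grad_steps):
        t = 0.1 + 0.9 * torch.rand(data.shape[0], 1)  # avoid t -> 0 blow-up
        eps = torch.randn_like(data)
        xt = data + t.sqrt() * eps                    # toy forward perturbation
        target = -eps / t.sqrt() - base_score(xt, t)  # full score minus base score
        loss = ((h_net(xt, t) - target) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()

h_net = HNet()
opt = torch.optim.Adam(h_net.parameters(), lr=1e-3)
for _ in range(10):                                   # outer refinement rounds
    x0, log_ratio = sample_with_weights(h_net)
    log_w = log_reward(x0) + log_ratio                # path-based importance weights
    idx = torch.multinomial(torch.softmax(log_w, 0), x0.shape[0], replacement=True)
    refine(h_net, opt, x0[idx])                       # resample, then refine
```

The outer loop mirrors the structure the abstract describes: sample under the current $h$-transform, weight completed paths by the reward times the path likelihood ratio, resample a synthetic dataset, and refit.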
Cite
Text
Denker et al. "Iterative Importance Fine-Tuning of Diffusion Models." ICLR 2025 Workshops: FPI, 2025.
Markdown
[Denker et al. "Iterative Importance Fine-Tuning of Diffusion Models." ICLR 2025 Workshops: FPI, 2025.](https://mlanthology.org/iclrw/2025/denker2025iclrw-iterative/)
BibTeX
@inproceedings{denker2025iclrw-iterative,
title = {{Iterative Importance Fine-Tuning of Diffusion Models}},
author = {Denker, Alexander and Padhy, Shreyas and Vargas, Francisco and Hertrich, Johannes},
booktitle = {ICLR 2025 Workshops: FPI},
year = {2025},
url = {https://mlanthology.org/iclrw/2025/denker2025iclrw-iterative/}
}