Solving Inverse Problems with Latent Diffusion Models via Hard Data Consistency

Abstract

Latent diffusion models have been demonstrated to generate high-quality images, while being more efficient to train than diffusion models that operate in pixel space. However, using latent diffusion models to solve inverse problems remains challenging due to the nonlinearity of the encoder and decoder. To address this issue, we propose ReSample, an algorithm that can solve general inverse problems with pre-trained latent diffusion models. Our algorithm enforces data consistency by solving an optimization problem during the reverse sampling process, a concept we term hard data consistency. After solving this optimization problem, we propose a novel resampling scheme that maps the measurement-consistent sample back onto the noisy data manifold, and we theoretically demonstrate its benefits. Lastly, we apply our algorithm to a wide range of linear and nonlinear inverse problems on both natural and medical images, demonstrating that our approach outperforms existing state-of-the-art approaches, including those based on pixel-space diffusion models.
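The abstract describes two components: a hard data consistency step that optimizes the estimated clean latent so the decoded image matches the measurement, and a resampling step that maps the result back onto the noisy data manifold at the current timestep. The sketch below is a minimal illustration of these two steps under stated assumptions; the decoder `decoder`, forward operator `A`, measurement `y`, schedule value `alpha_bar_t`, variance `sigma_t2`, and optimizer settings are placeholders introduced here, not the authors' released implementation.

```python
import math
import torch


def hard_data_consistency(z0_hat, y, decoder, A, steps=200, lr=1e-2):
    """Refine the estimated clean latent z0_hat so that the decoded image
    agrees with the measurement y, i.e. minimize ||y - A(decoder(z))||^2.
    (Sketch: optimizer choice and iteration count are assumptions.)"""
    z = z0_hat.clone().detach().requires_grad_(True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.sum((y - A(decoder(z))) ** 2)
        loss.backward()
        opt.step()
    return z.detach()


def stochastic_resample(z0_y, z_t_prime, alpha_bar_t, sigma_t2):
    """Map the measurement-consistent latent z0_y back onto the noisy manifold
    at time t by mixing it with the unconditional sample z_t_prime and adding
    noise. (Sketch: this posterior-style weighting is an assumed heuristic.)"""
    var_t = 1.0 - alpha_bar_t
    mean = (sigma_t2 * math.sqrt(alpha_bar_t) * z0_y + var_t * z_t_prime) / (sigma_t2 + var_t)
    std = math.sqrt(sigma_t2 * var_t / (sigma_t2 + var_t))
    return mean + std * torch.randn_like(z0_y)
```

In a full reverse-sampling loop, steps of this form would be applied at selected timesteps after the usual denoising update, leaving the remaining timesteps to the pre-trained latent diffusion sampler.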

Cite

Text

Song et al. "Solving Inverse Problems with Latent Diffusion Models via Hard Data Consistency." International Conference on Learning Representations, 2024.

Markdown

[Song et al. "Solving Inverse Problems with Latent Diffusion Models via Hard Data Consistency." International Conference on Learning Representations, 2024.](https://mlanthology.org/iclr/2024/song2024iclr-solving/)

BibTeX

@inproceedings{song2024iclr-solving,
  title     = {{Solving Inverse Problems with Latent Diffusion Models via Hard Data Consistency}},
  author    = {Song, Bowen and Kwon, Soo Min and Zhang, Zecheng and Hu, Xinyu and Qu, Qing and Shen, Liyue},
  booktitle = {International Conference on Learning Representations},
  year      = {2024},
  url       = {https://mlanthology.org/iclr/2024/song2024iclr-solving/}
}