Pseudoinverse-Guided Diffusion Models for Inverse Problems
Abstract
Diffusion models have become competitive candidates for solving various inverse problems. Models trained for specific inverse problems work well but are limited to their particular use cases, whereas methods that use problem-agnostic models are general but often perform worse empirically. To address this dilemma, we introduce Pseudoinverse-guided Diffusion Models ($\Pi$GDM), an approach that uses problem-agnostic models to close the gap in performance. $\Pi$GDM directly estimates conditional scores from the measurement model of the inverse problem without additional training. It can address inverse problems with noisy, non-linear, or even non-differentiable measurements, in contrast to many existing approaches that are limited to noiseless linear ones. We illustrate the empirical effectiveness of $\Pi$GDM on several image restoration tasks, including super-resolution, inpainting and JPEG restoration. On ImageNet, $\Pi$GDM is competitive with state-of-the-art diffusion models trained on specific tasks, and is the first to achieve this with problem-agnostic diffusion models. $\Pi$GDM can also solve a wider set of inverse problems where the measurement processes are composed of several simpler ones.
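The abstract's core idea, guiding a diffusion model toward measurement consistency via the pseudoinverse of the measurement operator, can be illustrated for the noiseless linear case. The sketch below is a simplified illustration under assumed notation (it is not the authors' exact formulation): for a measurement $y = Hx$ and a denoiser's clean-signal estimate `x0_hat`, the guidance residual $H^\dagger y - H^\dagger H \hat{x}_0$ vanishes exactly when the estimate is consistent with the measurement.

```python
import numpy as np

def pseudoinverse_guidance(H, y, x0_hat):
    """Residual H+ y - H+ H x0_hat (hypothetical helper name).

    This is zero if and only if x0_hat already explains the
    measurement y under the linear operator H; otherwise it points
    toward the measurement-consistent subspace.
    """
    H_pinv = np.linalg.pinv(H)
    return H_pinv @ y - H_pinv @ H @ x0_hat

# Toy inverse problem: 2x "super-resolution" where H averages
# adjacent pairs of a length-4 signal down to length 2.
H = np.array([[0.5, 0.5, 0.0, 0.0],
              [0.0, 0.0, 0.5, 0.5]])
x_true = np.array([1.0, 3.0, 2.0, 4.0])
y = H @ x_true

# A measurement-consistent estimate (the pseudoinverse reconstruction
# itself) yields zero guidance; an arbitrary estimate does not.
x0_consistent = np.linalg.pinv(H) @ y
g_zero = pseudoinverse_guidance(H, y, x0_consistent)

x0_wrong = np.zeros(4)
g_nonzero = pseudoinverse_guidance(H, y, x0_wrong)
```

In the full method this residual is backpropagated through the denoiser to adjust the sampling trajectory, which is what lets a problem-agnostic diffusion model solve the inverse problem without task-specific training.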
Cite
Text
Song et al. "Pseudoinverse-Guided Diffusion Models for Inverse Problems." International Conference on Learning Representations, 2023.
Markdown
[Song et al. "Pseudoinverse-Guided Diffusion Models for Inverse Problems." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/song2023iclr-pseudoinverseguided/)
BibTeX
@inproceedings{song2023iclr-pseudoinverseguided,
title = {{Pseudoinverse-Guided Diffusion Models for Inverse Problems}},
author = {Song, Jiaming and Vahdat, Arash and Mardani, Morteza and Kautz, Jan},
booktitle = {International Conference on Learning Representations},
year = {2023},
url = {https://mlanthology.org/iclr/2023/song2023iclr-pseudoinverseguided/}
}