Compressed Sensing with Invertible Generative Models and Dependent Noise

Abstract

We study image inverse problems with invertible generative priors, specifically normalizing flow models. Our formulation views the solution as the maximum a posteriori (MAP) estimate of the image given the measurements. Our general formulation allows for arbitrary differentiable noise models, including those with long-range dependencies, as well as non-linear differentiable forward operators. We establish theoretical recovery guarantees for denoising and compressed sensing under our framework. We also empirically validate our method on various inverse problems, including 1-bit compressed sensing and denoising with highly structured noise patterns.
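To make the MAP formulation concrete, the sketch below shows one way such a recovery could be implemented with gradient-based optimization over the image. This is a minimal illustration, not the authors' code: the names `flow.log_prob`, `forward_op`, and `noise_log_prob` are placeholders for a flow prior's exact log-density, a differentiable measurement operator, and a differentiable noise log-likelihood, and the additive-residual likelihood is an assumption (the paper's general formulation also covers non-additive cases such as 1-bit measurements).

```python
import torch

def map_recover(flow, forward_op, noise_log_prob, y, x_init,
                steps=1000, lr=1e-2):
    """Gradient-based MAP recovery sketch (hypothetical interface).

    Minimizes the negative log-posterior
        -log p(y | x) - log p_G(x),
    where p_G is the density of an invertible generative (flow) prior and
    p(y | x) is modeled through the residual y - forward_op(x).
    """
    x = x_init.clone().requires_grad_(True)
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        # Likelihood term: log-probability of the residual under the noise model.
        log_lik = noise_log_prob(y - forward_op(x))
        # Prior term: exact log-density of x under the normalizing flow.
        log_prior = flow.log_prob(x)
        loss = -(log_lik + log_prior).sum()
        loss.backward()
        opt.step()
    return x.detach()
```

Because the flow assigns an exact density to every image, the optimization can be carried out directly in image space; an equivalent variant would optimize over the latent code and map it through the inverse flow.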

Cite

Text

Whang et al. "Compressed Sensing with Invertible Generative Models and Dependent Noise." NeurIPS 2020 Workshops: Deep_Inverse, 2020.

Markdown

[Whang et al. "Compressed Sensing with Invertible Generative Models and Dependent Noise." NeurIPS 2020 Workshops: Deep_Inverse, 2020.](https://mlanthology.org/neuripsw/2020/whang2020neuripsw-compressed/)

BibTeX

@inproceedings{whang2020neuripsw-compressed,
  title     = {{Compressed Sensing with Invertible Generative Models and Dependent Noise}},
  author    = {Whang, Jay and Lei, Qi and Dimakis, Alex},
  booktitle = {NeurIPS 2020 Workshops: Deep_Inverse},
  year      = {2020},
  url       = {https://mlanthology.org/neuripsw/2020/whang2020neuripsw-compressed/}
}