Learned Imaging with Constraints and Uncertainty Quantification
Abstract
We outline new approaches to incorporate ideas from deep learning into wave-based least-squares imaging. The aim, and main contribution of this work, is the combination of handcrafted constraints with deep convolutional neural networks, as a way to harness their remarkable ability to generate natural images. The mathematical basis underlying our method is the expectation-maximization framework, where data are divided into batches and coupled to additional "latent" unknowns. These unknowns are pairs of elements from the original unknown space (but now coupled to a specific data batch) and network inputs. In this setting, the neural network controls the similarity between these additional parameters, acting as a "center" variable. The resulting problem amounts to a maximum-likelihood estimation of the network parameters when the augmented data model is marginalized over the latent variables.
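A minimal sketch (not taken from the paper itself) of the formulation the abstract describes, in assumed notation: A_i is the forward operator for data batch b_i, (x_i, z_i) are the latent pairs coupled to that batch, g_\theta is the convolutional network with weights \theta, and C is a handcrafted constraint set. The joint objective, with the network output acting as the "center" variable, could read

  \min_{\theta,\,\{x_i, z_i\}} \; \sum_{i} \tfrac{1}{2}\,\| A_i x_i - b_i \|_2^2 + \tfrac{\lambda^2}{2}\,\| x_i - g_\theta(z_i) \|_2^2 \quad \text{subject to } x_i \in C,

and the maximum-likelihood estimate of the network weights then follows from marginalizing the augmented model over the latent pairs,

  \hat{\theta} = \arg\max_{\theta} \; \log \int p(b, x, z \mid \theta)\, \mathrm{d}x\, \mathrm{d}z,

with expectation-maximization alternating between updates of the latent pairs and of \theta. All symbols above are assumptions introduced for illustration only.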
Cite
Text
Herrmann et al. "Learned Imaging with Constraints and Uncertainty Quantification." NeurIPS 2019 Workshops: Deep_Inverse, 2019.
Markdown
[Herrmann et al. "Learned Imaging with Constraints and Uncertainty Quantification." NeurIPS 2019 Workshops: Deep_Inverse, 2019.](https://mlanthology.org/neuripsw/2019/herrmann2019neuripsw-learned/)
BibTeX
@inproceedings{herrmann2019neuripsw-learned,
title = {{Learned Imaging with Constraints and Uncertainty Quantification}},
author = {Herrmann, Felix J. and Siahkoohi, Ali and Rizzuti, Gabrio},
booktitle = {NeurIPS 2019 Workshops: Deep_Inverse},
year = {2019},
url = {https://mlanthology.org/neuripsw/2019/herrmann2019neuripsw-learned/}
}