Learning Spectral Regularizations for Linear Inverse Problems

Abstract

One of the main challenges in linear inverse problems is that many such problems are ill-posed in the sense that the solution does not depend continuously on the data. To analyze this effect and reestablish continuous dependence, classical theory in Hilbert spaces largely relies on analyzing and manipulating the singular values of the linear operator and its pseudoinverse, with the goal of, on the one hand, keeping the singular values of the reconstruction operator bounded and, on the other hand, approximating the pseudoinverse sufficiently well for a given noise level. While classical regularization methods manipulate the singular values via explicitly defined functions, this paper considers learning such parameter choice rules such that one obtains higher-quality reconstructions while remaining within the setting of provably convergent spectral regularization methods. We discuss different ways of parametrizing our spectral regularization methods via neural networks, interpret existing feed-forward networks in the setting of spectral regularization, where they can become provably convergent via an additional projection, and finally demonstrate their superiority in 1D numerical examples.
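To make the spectral viewpoint concrete, the following is a minimal sketch (not the paper's learned method) of how a spectral regularization method acts on a linear inverse problem: the forward operator is decomposed via SVD, and a filter function replaces the unbounded `1/s` of the pseudoinverse with a bounded surrogate. The classical Tikhonov filter `g(s) = s / (s^2 + alpha)` is shown; the function names are illustrative.

```python
import numpy as np

def spectral_reconstruction(A, y, filter_fn):
    """Reconstruct x from y ≈ A x by filtering the singular values of A."""
    # Thin SVD: A = U diag(s) V^T
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # The pseudoinverse would use 1/s, which blows up for small s.
    # A spectral filter g(s) keeps the reconstruction operator bounded
    # while approximating 1/s on the large singular values.
    coeffs = filter_fn(s) * (U.T @ y)
    return Vt.T @ coeffs

def tikhonov_filter(s, alpha=1e-2):
    """Classical Tikhonov filter: g(s) = s / (s^2 + alpha), bounded by 1/(2*sqrt(alpha))."""
    return s / (s**2 + alpha)

# Example: a mildly ill-conditioned diagonal operator
A = np.diag([2.0, 1.0, 1e-3])
x_true = np.array([1.0, 3.0, 2.0])
y = A @ x_true  # in practice y would also carry noise
x_reg = spectral_reconstruction(A, y, lambda s: tikhonov_filter(s, alpha=1e-2))
```

A learned spectral regularization in the paper's sense replaces the explicit formula for `g(s)` (here, Tikhonov) with a parametrized function such as a neural network, while constraints like the additional projection mentioned in the abstract keep the resulting filter within the provably convergent regime.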

Cite

Text

Bauermeister et al. "Learning Spectral Regularizations for Linear Inverse Problems." NeurIPS 2020 Workshops: Deep_Inverse, 2020.

Markdown

[Bauermeister et al. "Learning Spectral Regularizations for Linear Inverse Problems." NeurIPS 2020 Workshops: Deep_Inverse, 2020.](https://mlanthology.org/neuripsw/2020/bauermeister2020neuripsw-learning/)

BibTeX

@inproceedings{bauermeister2020neuripsw-learning,
  title     = {{Learning Spectral Regularizations for Linear Inverse Problems}},
  author    = {Bauermeister, Hartmut and Burger, Martin and Moeller, Michael},
  booktitle = {NeurIPS 2020 Workshops: Deep_Inverse},
  year      = {2020},
  url       = {https://mlanthology.org/neuripsw/2020/bauermeister2020neuripsw-learning/}
}