Sparse Bayesian Generative Modeling for Compressive Sensing

Abstract

This work addresses the fundamental linear inverse problem in compressive sensing (CS) by introducing a new type of regularizing generative prior. Our proposed method utilizes ideas from classical dictionary-based CS and, in particular, sparse Bayesian learning (SBL), to integrate a strong regularization towards sparse solutions. At the same time, by leveraging the notion of conditional Gaussianity, it also incorporates the adaptability of generative models to training data. However, unlike most state-of-the-art generative models, it can learn from only a few compressed, noisy data samples and requires no optimization algorithm for solving the inverse problem. Additionally, similar to Dirichlet prior networks, our model parameterizes a conjugate prior, enabling its use for uncertainty quantification. We support our approach theoretically through the concept of variational inference and validate it empirically using different types of compressible signals.
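To give a concrete sense of the classical SBL building block the abstract refers to, the following is a minimal sketch (not the paper's method) of SBL via expectation-maximization on a toy compressive-sensing problem. The measurement model y = Ax + n, the Gaussian prior x ~ N(0, diag(γ)) with per-component variances γ, and the resulting Gaussian (conditionally Gaussian) posterior are standard; all dimensions, the noise level, and the iteration count here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (illustrative sizes): recover a sparse x from
# compressed, noisy measurements y = A x + n.
n, m, k = 100, 40, 5                      # signal dim, measurements, nonzeros
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
sigma2 = 1e-3                             # assumed known noise variance
y = A @ x_true + np.sqrt(sigma2) * rng.standard_normal(m)

# Classical SBL via EM: the prior x ~ N(0, diag(gamma)) is conjugate to the
# Gaussian likelihood, so the posterior over x is Gaussian in closed form.
gamma = np.ones(n)
for _ in range(200):
    # E-step: Gaussian posterior mean and covariance given current gamma.
    Sigma = np.linalg.inv(A.T @ A / sigma2 + np.diag(1.0 / gamma))
    mu = Sigma @ A.T @ y / sigma2
    # M-step: update prior variances; many gamma_i shrink toward zero,
    # which is the mechanism that enforces sparsity.
    gamma = np.maximum(mu**2 + np.diag(Sigma), 1e-10)

rel_err = np.linalg.norm(mu - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {rel_err:.3f}")
```

The closed-form Gaussian posterior in the E-step is the same conditional Gaussianity the paper leverages: no iterative optimization is needed to invert the measurements once the prior variances are fixed.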

Cite

Text

Böck et al. "Sparse Bayesian Generative Modeling for Compressive Sensing." Neural Information Processing Systems, 2024. doi:10.52202/079017-0151

Markdown

[Böck et al. "Sparse Bayesian Generative Modeling for Compressive Sensing." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/bock2024neurips-sparse/) doi:10.52202/079017-0151

BibTeX

@inproceedings{bock2024neurips-sparse,
  title     = {{Sparse Bayesian Generative Modeling for Compressive Sensing}},
  author    = {Böck, Benedikt and Syed, Sadaf and Utschick, Wolfgang},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-0151},
  url       = {https://mlanthology.org/neurips/2024/bock2024neurips-sparse/}
}