Precise Asymptotics for Phase Retrieval and Compressed Sensing with Random Generative Priors
Abstract
We consider the problems of compressed sensing and of (real-valued) phase retrieval with a random measurement matrix. We analyse the sharp asymptotics of the information-theoretically optimal performance and of the best-known polynomial-time algorithms under a generative prior consisting of a single-layer neural network with a random weight matrix. We compare this performance to that of sparse separable priors and conclude that generative priors might be advantageous in terms of algorithmic performance. In particular, while sparsity does not allow compressive phase retrieval to be performed efficiently close to its information-theoretic limit, we find that compressive phase retrieval becomes tractable under the random generative prior.
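As a concrete illustration of the setup described in the abstract, the sketch below draws a signal from a single-layer random generative prior and produces the corresponding compressed-sensing and (real-valued) phase-retrieval measurements. This is a minimal sketch, not the authors' code; the dimensions, the ReLU activation, and the Gaussian scalings are illustrative assumptions.

```python
# Minimal sketch (assumptions: ReLU activation, Gaussian weights/measurements,
# illustrative dimensions) of the measurement models discussed in the abstract.
import numpy as np

rng = np.random.default_rng(0)

k, n, m = 50, 200, 400  # latent dimension, signal dimension, number of measurements

# Generative prior: x* = relu(W z*), a single-layer network with random weights.
W = rng.standard_normal((n, k)) / np.sqrt(k)
z_star = rng.standard_normal(k)
x_star = np.maximum(W @ z_star, 0.0)

# Random Gaussian measurement matrix.
A = rng.standard_normal((m, n)) / np.sqrt(n)

# Compressed sensing: the linear measurements themselves are observed.
y_cs = A @ x_star

# Real-valued phase retrieval: only the magnitudes of the measurements are observed.
y_phase = np.abs(A @ x_star)
```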
Cite
Text
Aubin et al. "Precise Asymptotics for Phase Retrieval and Compressed Sensing with Random Generative Priors." NeurIPS 2019 Workshops: Deep_Inverse, 2019.
Markdown
[Aubin et al. "Precise Asymptotics for Phase Retrieval and Compressed Sensing with Random Generative Priors." NeurIPS 2019 Workshops: Deep_Inverse, 2019.](https://mlanthology.org/neuripsw/2019/aubin2019neuripsw-precise/)
BibTeX
@inproceedings{aubin2019neuripsw-precise,
title = {{Precise Asymptotics for Phase Retrieval and Compressed Sensing with Random Generative Priors}},
author = {Aubin, Benjamin and Loureiro, Bruno and Baker, Antoine and Krzakala, Florent and Zdeborova, Lenka},
booktitle = {NeurIPS 2019 Workshops: Deep_Inverse},
year = {2019},
url = {https://mlanthology.org/neuripsw/2019/aubin2019neuripsw-precise/}
}