Beyond Independent Measurements: General Compressed Sensing with GNN Application
Abstract
We consider the problem of recovering a structured signal $\mathbf{x} \in \mathbb{R}^{n}$ from noisy linear observations $\mathbf{y} = \mathbf{M} \mathbf{x} + \mathbf{w}$. The measurement matrix is modeled as $\mathbf{M} = \mathbf{B}\mathbf{A}$, where $\mathbf{B} \in \mathbb{R}^{l \times m}$ is arbitrary and $\mathbf{A} \in \mathbb{R}^{m \times n}$ has independent sub-gaussian rows. By varying $\mathbf{B}$ and the sub-gaussian distribution of $\mathbf{A}$, this model yields a family of measurement matrices that may have heavy tails, dependent rows and columns, and singular values with a large dynamic range. When the structure is given as a possibly non-convex cone $T \subset \mathbb{R}^{n}$, an approximate empirical risk minimizer is proven to be a robust estimator provided the effective number of measurements is sufficient, even in the presence of a model mismatch. In classical compressed sensing with independent (sub-)gaussian measurements, one asks \textit{how many measurements are needed to recover $\mathbf{x}$?} In our setting, however, the effective number of measurements depends on the properties of $\mathbf{B}$. We show that the \textit{effective rank} of $\mathbf{B}$ may be used as a surrogate for the number of measurements: if it exceeds the squared \textit{Gaussian mean width} of $(T-T) \cap \mathbb{S}^{n-1}$, then accurate recovery is guaranteed. Furthermore, we examine the special case of generative priors in detail, that is, when $\mathbf{x}$ lies close to $T = \mathrm{ran}(G)$ and $G: \mathbb{R}^k \rightarrow \mathbb{R}^n$ is a Generative Neural Network (GNN) with ReLU activation functions. Our work relies on a recent result in random matrix theory by Jeong, Li, Plan, and Yılmaz.
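To make the abstract's quantities concrete, here is a minimal Python sketch (not the authors' code) that simulates the measurement model $\mathbf{y} = \mathbf{B}\mathbf{A}\mathbf{x} + \mathbf{w}$ with a toy random ReLU generative prior, computes an effective rank of $\mathbf{B}$ (taken here as the stable rank $\|\mathbf{B}\|_F^2 / \|\mathbf{B}\|^2$, an assumption; the paper's precise definition may differ), and forms a crude Monte Carlo lower bound on the Gaussian mean width of $(T-T) \cap \mathbb{S}^{n-1}$ for $T = \mathrm{ran}(G)$. All dimensions, network weights, and the choice of $\mathbf{B}$ are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
k, n, m, l = 4, 100, 60, 40   # latent dim, signal dim, rows of A, rows of B (illustrative)

# Toy ReLU generative network G: R^k -> R^n with random weights (an assumption,
# standing in for a trained GNN).
W1 = rng.standard_normal((50, k))
W2 = rng.standard_normal((n, 50))
def G(z):
    return W2 @ np.maximum(W1 @ z, 0.0)

# Measurement matrix M = B A: A has independent sub-gaussian (here Gaussian) rows,
# while B is arbitrary -- here built with rapidly decaying singular values so that
# its spectrum has a large dynamic range.
A = rng.standard_normal((m, n)) / np.sqrt(m)
Ub, _ = np.linalg.qr(rng.standard_normal((l, l)))
Vb, _ = np.linalg.qr(rng.standard_normal((m, l)))   # orthonormal columns
s = 1.0 / np.arange(1, l + 1)                       # decaying singular values
B = Ub @ np.diag(s) @ Vb.T                          # shape (l, m)
M = B @ A

# Noisy observations of a signal in ran(G)
x = G(rng.standard_normal(k))
y = M @ x + 0.01 * rng.standard_normal(l)

# Effective rank of B, taken here as the stable rank ||B||_F^2 / ||B||_2^2
sv = np.linalg.svd(B, compute_uv=False)
eff_rank = (sv ** 2).sum() / sv[0] ** 2
print("effective (stable) rank of B:", eff_rank)

# Crude Monte Carlo lower bound on the Gaussian mean width
# w(K) = E sup_{u in K} <g, u>, with K = (T - T) ∩ S^{n-1} and T = ran(G):
# the supremum is approximated by a maximum over randomly sampled pairs.
def mean_width_estimate(num_g=20, num_pairs=1000):
    widths = []
    for _ in range(num_g):
        g = rng.standard_normal(n)
        z1 = rng.standard_normal((num_pairs, k))
        z2 = rng.standard_normal((num_pairs, k))
        diffs = np.array([G(a) - G(b) for a, b in zip(z1, z2)])
        norms = np.linalg.norm(diffs, axis=1)
        keep = norms > 1e-9
        unit_diffs = diffs[keep] / norms[keep, None]   # project onto the unit sphere
        widths.append(np.max(unit_diffs @ g))
    return float(np.mean(widths))

w_hat = mean_width_estimate()
print("estimated Gaussian mean width of (T-T) ∩ S^{n-1}:", w_hat)
print("recovery heuristic (eff. rank vs. squared width):", eff_rank, "vs.", w_hat ** 2)
```

The final comparison mirrors the abstract's recovery condition only heuristically: the actual theorem involves absolute constants and the paper's own definition of effective rank, and the sampled maximum underestimates the true supremum over $\mathrm{ran}(G)$.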
Cite

Text
Naderi and Plan. "Beyond Independent Measurements: General Compressed Sensing with GNN Application." NeurIPS 2021 Workshops: Deep_Inverse, 2021.

Markdown
[Naderi and Plan. "Beyond Independent Measurements: General Compressed Sensing with GNN Application." NeurIPS 2021 Workshops: Deep_Inverse, 2021.](https://mlanthology.org/neuripsw/2021/naderi2021neuripsw-beyond/)

BibTeX
@inproceedings{naderi2021neuripsw-beyond,
  title     = {{Beyond Independent Measurements: General Compressed Sensing with GNN Application}},
  author    = {Naderi, Alireza and Plan, Yaniv},
  booktitle = {NeurIPS 2021 Workshops: Deep_Inverse},
  year      = {2021},
  url       = {https://mlanthology.org/neuripsw/2021/naderi2021neuripsw-beyond/}
}