Limits on Sparse Support Recovery via Linear Sketching with Random Expander Matrices
Abstract
Linear sketching is a powerful tool for the problem of sparse signal recovery, having numerous applications such as compressive sensing, data stream computing, graph sketching, and routing. Motivated by applications where the \emph{positions} of the non-zero entries in a sparse vector are of primary interest, we consider the problem of \emph{support recovery} from a linear sketch taking the form $\mathbf{Y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{Z}$. We focus on a widely-used expander-based construction in which the columns of the measurement matrix $\mathbf{X} \in \mathbb{R}^{n \times p}$ are random permutations of a sparse binary vector containing $d \ll n$ ones and $n - d$ zeros. We provide a sharp characterization of the number of measurements required for an information-theoretically optimal decoder, thus permitting a precise comparison to the i.i.d. Gaussian construction. Our findings reveal both positive and negative results, showing that the performance nearly matches the Gaussian construction at moderate-to-high noise levels, while being worse by an arbitrarily large factor at low noise levels.
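To make the measurement model concrete, the following is a minimal illustrative sketch (not the authors' code) of the expander-style construction described above: each column of $\mathbf{X}$ is an independent random permutation of a binary vector with $d$ ones and $n - d$ zeros, and the sketch is $\mathbf{Y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{Z}$ with additive Gaussian noise. The specific values of `n`, `p`, `d`, the support size `k`, and the noise level `sigma` are assumed for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 128, 1024   # number of measurements, signal dimension (assumed values)
d = 8              # ones per column of the measurement matrix, d << n (assumed)
k = 5              # support size of the sparse signal beta (assumed)
sigma = 0.1        # noise standard deviation (assumed)

# Columns of X: independent random permutations of a length-n binary vector
# containing d ones and n - d zeros.
X = np.zeros((n, p))
for j in range(p):
    X[rng.permutation(n)[:d], j] = 1.0

# Sparse signal beta supported on a random size-k subset of {0, ..., p-1}.
beta = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
beta[support] = rng.standard_normal(k)

# Linear sketch Y = X beta + Z with i.i.d. Gaussian noise.
Z = sigma * rng.standard_normal(n)
Y = X @ beta + Z
```

Support recovery then asks a decoder to identify `support` from `(Y, X)`; the paper characterizes how many measurements `n` an information-theoretically optimal decoder needs for this construction.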
Cite
Text
Scarlett and Cevher. "Limits on Sparse Support Recovery via Linear Sketching with Random Expander Matrices." International Conference on Artificial Intelligence and Statistics, 2016.
Markdown
[Scarlett and Cevher. "Limits on Sparse Support Recovery via Linear Sketching with Random Expander Matrices." International Conference on Artificial Intelligence and Statistics, 2016.](https://mlanthology.org/aistats/2016/scarlett2016aistats-limits/)
BibTeX
@inproceedings{scarlett2016aistats-limits,
title = {{Limits on Sparse Support Recovery via Linear Sketching with Random Expander Matrices}},
author = {Scarlett, Jonathan and Cevher, Volkan},
booktitle = {International Conference on Artificial Intelligence and Statistics},
year = {2016},
pages = {149--158},
url = {https://mlanthology.org/aistats/2016/scarlett2016aistats-limits/}
}