One-Bit Compressed Sensing: Provable Support and Vector Recovery
Abstract
In this paper, we study the problem of one-bit compressed sensing (1-bit CS), where the goal is to design a measurement matrix $A$ and a recovery algorithm such that a $k$-sparse vector $\mathbf{x}^*$ can be efficiently recovered from signed linear measurements, i.e., $b = \operatorname{sign}(A\mathbf{x}^*)$. This is an important problem in signal acquisition and also has several learning applications, e.g., multi-label classification (Hsu et al., 2010). We study this problem in two settings: (a) support recovery: recover $\operatorname{supp}(\mathbf{x}^*)$; (b) approximate vector recovery: recover a unit vector $\hat{\mathbf{x}}$ such that $\left\| \hat{\mathbf{x}} - \mathbf{x}^*/\|\mathbf{x}^*\|_2 \right\|_2 \le \epsilon$. For support recovery, we propose two novel and efficient solutions based on two combinatorial structures: union-free families of sets and expanders. In contrast to existing methods for support recovery, our methods are universal, i.e., a single measurement matrix $A$ can recover almost all signals. For approximate recovery, we propose the first method that recovers a sparse vector using a near-optimal number of measurements. We also empirically demonstrate the effectiveness of our algorithms: they recover signals using fewer measurements than several existing methods.
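To make the measurement model concrete, below is a minimal Python/NumPy sketch (not from the paper) that generates one-bit measurements $b = \operatorname{sign}(A\mathbf{x}^*)$ from a random Gaussian matrix and applies a simple hard-thresholded back-projection baseline. It does not implement the union-free-family, expander, or near-optimal vector-recovery algorithms proposed in the paper; all dimensions and the recovery rule here are illustrative assumptions.

```python
# Sketch of the 1-bit CS measurement model b = sign(A x*) with a simple
# baseline recovery: hard-threshold A^T b to the k largest entries and
# normalize. This is only an illustration of the problem setup, not the
# paper's algorithms.
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 1000, 10, 400            # ambient dimension, sparsity, #measurements

# k-sparse ground-truth signal, normalized (1-bit measurements lose the scale)
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
x_true /= np.linalg.norm(x_true)

# Gaussian measurement matrix and signed (one-bit) measurements
A = rng.standard_normal((m, n))
b = np.sign(A @ x_true)

# Baseline: back-project the signs, keep the k largest coordinates, normalize
proxy = A.T @ b
est_support = np.argsort(np.abs(proxy))[-k:]
x_hat = np.zeros(n)
x_hat[est_support] = proxy[est_support]
x_hat /= np.linalg.norm(x_hat)

print("support recovered:", set(est_support) == set(support))
print("l2 error:", np.linalg.norm(x_hat - x_true))
```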
Cite
Gopi et al. "One-Bit Compressed Sensing: Provable Support and Vector Recovery." International Conference on Machine Learning, 2013.
BibTeX
@inproceedings{gopi2013icml-onebit,
title = {{One-Bit Compressed Sensing: Provable Support and Vector Recovery}},
author = {Gopi, Sivakant and Netrapalli, Praneeth and Jain, Prateek and Nori, Aditya},
booktitle = {International Conference on Machine Learning},
year = {2013},
pages = {154-162},
volume = {28},
url = {https://mlanthology.org/icml/2013/gopi2013icml-onebit/}
}