Learning a Compressed Sensing Measurement Matrix via Gradient Unrolling
Abstract
Linear encoding of sparse vectors is widely used, but is commonly data-independent: it misses any extra (but a priori unknown) structure beyond sparsity. In this paper we present a new method to learn linear encoders that adapt to data, while still performing well with the widely used $\ell_1$ decoder. The convex $\ell_1$ decoder, however, blocks the gradient propagation needed for standard gradient-based training. Our method is based on the insight that unrolling the convex decoder into $T$ projected subgradient steps removes this obstacle. Our method can thus be seen as a data-driven way to learn a compressed sensing measurement matrix. We compare the empirical performance of 10 algorithms on 6 sparse datasets (3 synthetic and 3 real). Our experiments show that the real datasets do contain additional structure beyond sparsity; our method discovers and exploits it to produce excellent reconstructions with fewer measurements (by a factor of 1.1-3x) than the previous state-of-the-art methods. We illustrate an application of our method to learning label embeddings for extreme multi-label classification, and show empirically that it matches or outperforms the precision scores of SLEEC, one of the state-of-the-art embedding-based approaches.
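The key mechanism the abstract describes is the unrolled decoder: $T$ projected subgradient steps for $\min \|x\|_1$ subject to $Ax = y$, each of which is differentiable in $A$, so a reconstruction loss can backpropagate into the measurement matrix. Below is a minimal PyTorch sketch of this idea, not the paper's implementation: it assumes $A$ has full row rank (so the projection onto $\{x : Ax = y\}$ has the closed form $z - A^\top (AA^\top)^{-1}(Az - y)$) and uses a fixed diminishing step-size schedule; the function name `unrolled_l1_decoder` and all hyperparameters are illustrative.

```python
import torch

def unrolled_l1_decoder(A, y, T=15, step0=1.0):
    """T projected-subgradient steps for: min ||x||_1  s.t.  A x = y.

    Every operation is differentiable in A, so a loss on the output
    can backpropagate into the measurement matrix.
    """
    AAt_inv = torch.inverse(A @ A.T)              # (m, m); assumes full row rank

    def project(z):                               # projection onto {x : A x = y}
        return z - A.T @ (AAt_inv @ (A @ z - y))

    x = project(torch.zeros_like(A[0]))           # minimum-norm feasible start
    for t in range(1, T + 1):
        # Subgradient of ||x||_1 is sign(x); step, then project back.
        x = project(x - (step0 / t) * torch.sign(x))
    return x

# Toy training loop on synthetic k-sparse vectors (illustrative setup).
n, m, k = 100, 20, 5
A = torch.nn.Parameter(torch.randn(m, n) / m ** 0.5)
opt = torch.optim.Adam([A], lr=1e-3)
for _ in range(1000):
    x = torch.zeros(n)
    x[torch.randperm(n)[:k]] = torch.randn(k)     # one k-sparse training sample
    loss = torch.sum((unrolled_l1_decoder(A, A @ x) - x) ** 2)
    opt.zero_grad()
    loss.backward()                               # gradient flows through all T steps
    opt.step()
```

A natural extension of this sketch is to make the step sizes trainable parameters alongside $A$, e.g. one learnable scalar per unrolled step, which changes only the `step0 / t` line.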
Cite
Text
Wu et al. "Learning a Compressed Sensing Measurement Matrix via Gradient Unrolling." International Conference on Machine Learning, 2019.

Markdown
[Wu et al. "Learning a Compressed Sensing Measurement Matrix via Gradient Unrolling." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/wu2019icml-learning/)

BibTeX
@inproceedings{wu2019icml-learning,
  title     = {{Learning a Compressed Sensing Measurement Matrix via Gradient Unrolling}},
  author    = {Wu, Shanshan and Dimakis, Alex and Sanghavi, Sujay and Yu, Felix and Holtmann-Rice, Daniel and Storcheus, Dmitry and Rostamizadeh, Afshin and Kumar, Sanjiv},
  booktitle = {International Conference on Machine Learning},
  year      = {2019},
  pages     = {6828--6839},
  volume    = {97},
  url       = {https://mlanthology.org/icml/2019/wu2019icml-learning/}
}