Gradient Descent with Sparsification: An Iterative Algorithm for Sparse Recovery with Restricted Isometry Property
Abstract
In this paper, we present an algorithm for finding an $s$-sparse vector $x$ that minimizes the {\em square-error} $\| y - \Phi x \|^2$ where $\Phi$ satisfies the {\em restricted isometry property} (RIP). Our algorithm, called {\em {\GraDeS}} (Gradient Descent with Sparsification), starts from an arbitrary $s$-sparse $x$ and iteratively updates it as: $x \leftarrow H_s\left(x + \frac{1}{\gamma} \cdot \Phi^\top(y - \Phi x)\right)$ where $\gamma > 1$ is a constant and $H_s$ sets all but the largest $s$ coordinates (in absolute value) to zero. We show that {\GraDeS}, in a constant number of iterations, computes the correct $s$-sparse solution to the system $y=\Phi x$ where $\Phi$ satisfies the condition that the {\em isometric constant} $\delta_{2s} < 1/3$. This is the most general condition for which a {\em near-linear time} algorithm is known. In comparison, the best condition under which any polynomial-time algorithm is known is $\delta_{2s} < \sqrt{2}-1$. An important contribution of the paper is to analyze how the hard-thresholding function $H_s$ acts w.r.t. the potential $\|y - \Phi x\|^2$. A special case of {\GraDeS}, corresponding to $\gamma = 1$, called {\em Iterative Hard Thresholding (IHT)}, was previously shown to converge when $\delta_{3s} < 1/\sqrt{32}$~\cite{IHT:BD08}. Our Matlab implementation of {\GraDeS} outperforms previously proposed algorithms such as Subspace Pursuit, StOMP, OMP, and Lasso by an order of magnitude. Curiously, our experiments also uncovered several cases where L1-regularized regression (Lasso) fails but {\GraDeS} finds the correct solution.
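The update rule in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' Matlab implementation; the step size $\gamma = 4/3$, the iteration count, and the problem dimensions below are assumptions chosen for the example (the paper only requires $\gamma > 1$ with a suitable relation to $\delta_{2s}$).

```python
import numpy as np

def hard_threshold(x, s):
    """H_s: keep the s largest-magnitude coordinates of x, zero out the rest."""
    out = np.zeros_like(x)
    if s > 0:
        keep = np.argpartition(np.abs(x), -s)[-s:]
        out[keep] = x[keep]
    return out

def grades(Phi, y, s, gamma=4/3, n_iter=500):
    """GraDeS iteration: x <- H_s(x + (1/gamma) * Phi^T (y - Phi x)).

    gamma = 4/3 is an illustrative choice (any constant gamma > 1 is
    admissible per the abstract); convergence is guaranteed when the
    restricted isometry constant delta_{2s} of Phi is small enough.
    """
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        x = hard_threshold(x + (1.0 / gamma) * (Phi.T @ (y - Phi @ x)), s)
    return x
```

For a quick check, a random Gaussian matrix with i.i.d. $N(0, 1/m)$ entries satisfies the RIP with high probability when $m$ is a modest multiple of $s \log(n/s)$, so `grades` applied to $y = \Phi x$ for a random $s$-sparse $x$ should recover $x$ in such a setting.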
Cite

Garg and Khandekar. "Gradient Descent with Sparsification: An Iterative Algorithm for Sparse Recovery with Restricted Isometry Property." International Conference on Machine Learning, 2009. doi:10.1145/1553374.1553417

BibTeX
@inproceedings{garg2009icml-gradient,
title = {{Gradient Descent with Sparsification: An Iterative Algorithm for Sparse Recovery with Restricted Isometry Property}},
author = {Garg, Rahul and Khandekar, Rohit},
booktitle = {International Conference on Machine Learning},
year = {2009},
pages = {337-344},
doi = {10.1145/1553374.1553417},
url = {https://mlanthology.org/icml/2009/garg2009icml-gradient/}
}