Approximate Message Passing for Amplitude Based Optimization
Abstract
We consider an $\ell_2$-regularized non-convex optimization problem for recovering signals from their noisy phaseless observations. We design and study the performance of a message passing algorithm that aims to solve this optimization problem. We consider the asymptotic setting $m,n \rightarrow \infty$, $m/n \rightarrow \delta$, where $m$ is the number of measurements and $n$ is the signal dimension, and obtain sharp performance bounds. We show that for complex signals the algorithm can perform accurate recovery with only $m=\left ( \frac{64}{\pi^2}-4\right)n\approx 2.5n$ measurements. We also provide a sharp analysis of the algorithm's sensitivity to noise. We highlight the following facts about our message passing algorithm: (i) adding $\ell_2$ regularization to the non-convex loss function can be beneficial even in the noiseless setting; (ii) spectral initialization has marginal impact on the performance of the algorithm.
Cite
Text
Ma et al. "Approximate Message Passing for Amplitude Based Optimization." International Conference on Machine Learning, 2018.
Markdown
[Ma et al. "Approximate Message Passing for Amplitude Based Optimization." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/ma2018icml-approximate/)
BibTeX
@inproceedings{ma2018icml-approximate,
title = {{Approximate Message Passing for Amplitude Based Optimization}},
author = {Ma, Junjie and Xu, Ji and Maleki, Arian},
booktitle = {International Conference on Machine Learning},
year = {2018},
pages = {3365--3374},
volume = {80},
url = {https://mlanthology.org/icml/2018/ma2018icml-approximate/}
}