The Sound of APALM Clapping: Faster Nonsmooth Nonconvex Optimization with Stochastic Asynchronous PALM
Abstract
We introduce the Stochastic Asynchronous Proximal Alternating Linearized Minimization (SAPALM) method, a block coordinate stochastic proximal-gradient method for solving nonconvex, nonsmooth optimization problems. SAPALM is the first asynchronous parallel optimization method that provably converges on a large class of nonconvex, nonsmooth problems. We prove that SAPALM matches the best known rates of convergence --- among synchronous or asynchronous methods --- on this problem class. We provide upper bounds on the number of workers for which we can expect to see a linear speedup, which match the best bounds known for less complex problems, and show that in practice SAPALM achieves this linear speedup. We demonstrate state-of-the-art performance on several matrix factorization problems.
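As a rough illustration of the method family the abstract describes, the serial PALM iteration that SAPALM builds on alternates proximal-gradient steps over coordinate blocks. The sketch below applies it to a sparse matrix factorization problem; the problem choice, variable names, and regularization weight `lam` are illustrative assumptions, and it deliberately omits SAPALM's asynchronous updates and stochastic gradient noise:

```python
import numpy as np

def soft_threshold(Z, t):
    # Prox of t * ||.||_1: componentwise soft-thresholding.
    return np.sign(Z) * np.maximum(np.abs(Z) - t, 0.0)

def palm_matrix_factorization(M, rank, lam=0.1, iters=200, seed=0):
    """Serial PALM sketch for min_{X,Y} (1/2)||M - XY||_F^2 + lam(||X||_1 + ||Y||_1).

    Illustrative only: SAPALM additionally samples blocks at random,
    lets workers update asynchronously from delayed iterates, and
    tolerates noisy (stochastic) gradient estimates.
    """
    rng = np.random.default_rng(seed)
    m, n = M.shape
    X = rng.standard_normal((m, rank))
    Y = rng.standard_normal((rank, n))
    for _ in range(iters):
        # Block X: linearize the smooth term at the current point and take a
        # proximal step with step size 1 / L_X (block Lipschitz constant).
        L_X = np.linalg.norm(Y @ Y.T, 2) + 1e-12
        grad_X = (X @ Y - M) @ Y.T
        X = soft_threshold(X - grad_X / L_X, lam / L_X)
        # Block Y: same proximal-gradient step, holding X fixed.
        L_Y = np.linalg.norm(X.T @ X, 2) + 1e-12
        grad_Y = X.T @ (X @ Y - M)
        Y = soft_threshold(Y - grad_Y / L_Y, lam / L_Y)
    return X, Y
```

In SAPALM, each asynchronous worker performs one such block update using a possibly stale read of the shared iterate, which is what the paper's convergence analysis must account for.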
Cite
Text

Davis et al. "The Sound of APALM Clapping: Faster Nonsmooth Nonconvex Optimization with Stochastic Asynchronous PALM." Neural Information Processing Systems, 2016.

Markdown

[Davis et al. "The Sound of APALM Clapping: Faster Nonsmooth Nonconvex Optimization with Stochastic Asynchronous PALM." Neural Information Processing Systems, 2016.](https://mlanthology.org/neurips/2016/davis2016neurips-sound/)

BibTeX
@inproceedings{davis2016neurips-sound,
title = {{The Sound of APALM Clapping: Faster Nonsmooth Nonconvex Optimization with Stochastic Asynchronous PALM}},
author = {Davis, Damek and Edmunds, Brent and Udell, Madeleine},
booktitle = {Neural Information Processing Systems},
year = {2016},
pages = {226--234},
url = {https://mlanthology.org/neurips/2016/davis2016neurips-sound/}
}