Redundancy Techniques for Straggler Mitigation in Distributed Optimization and Learning
Abstract
Performance of distributed optimization and learning systems is bottlenecked by “straggler” nodes and slow communication links, which significantly delay computation. We propose a distributed optimization framework in which the dataset is “encoded” to have an over-complete representation with built-in redundancy, and the straggling nodes in the system are dynamically treated as missing, or as “erasures”, at every iteration, their loss compensated by the embedded redundancy. For quadratic loss functions, we show that under a simple encoding scheme, many optimization algorithms (gradient descent, L-BFGS, and proximal gradient) operating under data parallelism converge to an approximate solution even when stragglers are ignored. Furthermore, we show a similar result for a wider class of convex loss functions operating under model parallelism. The applicable classes of objectives cover several popular learning problems, such as linear regression, LASSO, support vector machines, collaborative filtering, and generalized linear models including logistic regression. These convergence results are deterministic, i.e., they establish sample-path convergence for arbitrary sequences of delay patterns or distributions on the nodes, and are independent of the tail behavior of the delay distribution. We demonstrate that equiangular tight frames have desirable properties as encoding matrices, and propose efficient mechanisms for encoding large-scale data. We implement the proposed technique on Amazon EC2 clusters, demonstrate its performance on several learning problems, including matrix factorization, LASSO, ridge regression, and logistic regression, and compare the proposed method with uncoded, asynchronous, and data-replication strategies.
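To make the data-parallel quadratic case concrete, below is a minimal numpy sketch of the encoding idea: the master encodes (A, b) with a tall tight frame S, partitions the encoded rows across m workers, and at each iteration aggregates partial gradients only from the workers that respond, treating the rest as erasures. This is an illustrative toy, not the paper's implementation: the random partial-orthogonal encoder stands in for the equiangular tight frames the paper recommends, and the redundancy factor, uniform-random straggler model, and step size are assumptions chosen for the sketch.

import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares instance: min_x ||A x - b||^2, solved by m workers
# while n_stragglers of them are ignored at every iteration.
n, d, m = 600, 40, 8
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)

# Encode with a tall tight frame S (S.T @ S = I), redundancy factor beta = 2.
# A random matrix with orthonormal columns is a simple stand-in; the paper
# advocates equiangular tight frames, roughly because subsets of their rows
# remain well-conditioned.
beta = 2
S = np.linalg.qr(rng.standard_normal((beta * n, n)))[0]
SA, Sb = S @ A, S @ b

# Row-partition the encoded data across the workers.
parts = np.array_split(np.arange(beta * n), m)

x = np.zeros(d)
step = 1.0 / np.linalg.norm(SA, 2) ** 2   # conservative step (1/L of the encoded quadratic)
n_stragglers = 2
for _ in range(300):
    stragglers = rng.choice(m, size=n_stragglers, replace=False)
    g = np.zeros(d)
    for i in range(m):
        if i in stragglers:            # treat this worker as an erasure
            continue
        Ai, bi = SA[parts[i]], Sb[parts[i]]
        g += Ai.T @ (Ai @ x - bi)      # worker i's partial gradient
    x -= step * g                      # master applies the aggregate step

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))

Since S has orthonormal columns, the aggregate over all m workers equals the exact uncoded gradient A^T(Ax - b); dropping a subset of workers perturbs it, and the redundancy in S bounds that perturbation, which is what drives the approximate-convergence behavior described in the abstract.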
Cite
Text
Karakus et al. "Redundancy Techniques for Straggler Mitigation in Distributed Optimization and Learning." Journal of Machine Learning Research, 2019.
Markdown
[Karakus et al. "Redundancy Techniques for Straggler Mitigation in Distributed Optimization and Learning." Journal of Machine Learning Research, 2019.](https://mlanthology.org/jmlr/2019/karakus2019jmlr-redundancy/)
BibTeX
@article{karakus2019jmlr-redundancy,
  title = {{Redundancy Techniques for Straggler Mitigation in Distributed Optimization and Learning}},
  author = {Karakus, Can and Sun, Yifan and Diggavi, Suhas and Yin, Wotao},
  journal = {Journal of Machine Learning Research},
  year = {2019},
  pages = {1--47},
  volume = {20},
  url = {https://mlanthology.org/jmlr/2019/karakus2019jmlr-redundancy/}
}