Breaking Locality Accelerates Block Gauss-Seidel
Abstract
Recent work by Nesterov and Stich (2016) showed that momentum can be used to accelerate the rate of convergence for block Gauss-Seidel in the setting where a fixed partitioning of the coordinates is chosen ahead of time. We show that this setting is too restrictive, constructing instances where breaking locality by running non-accelerated Gauss-Seidel with randomly sampled coordinates substantially outperforms accelerated Gauss-Seidel with any fixed partitioning. Motivated by this finding, we analyze the accelerated block Gauss-Seidel algorithm in the random coordinate sampling setting. Our analysis captures the benefit of acceleration with a new data-dependent parameter which is well behaved when the matrix sub-blocks are well-conditioned. Empirically, we show that accelerated Gauss-Seidel with random coordinate sampling provides speedups for large-scale machine learning tasks when compared to non-accelerated Gauss-Seidel and the classical conjugate-gradient algorithm.
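To make the two sampling schemes compared in the abstract concrete, here is a minimal NumPy sketch of (non-accelerated) block Gauss-Seidel for a positive-definite system Ax = b. The function name `block_gauss_seidel`, the contiguous fixed partition, and the `fixed_partition` toggle are illustrative assumptions, not the authors' code; the accelerated (momentum) variant analyzed in the paper is not shown.

```python
import numpy as np

def block_gauss_seidel(A, b, block_size, num_iters, fixed_partition=False, seed=0):
    """Sketch of block Gauss-Seidel for Ax = b with A positive definite.

    fixed_partition=True cycles through a partition fixed ahead of time;
    fixed_partition=False draws a fresh random coordinate block each
    iteration ("breaking locality").
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    x = np.zeros(n)
    # Fixed partitioning: contiguous blocks chosen ahead of time
    # (assumes block_size divides n, for simplicity).
    partition = np.arange(n).reshape(-1, block_size)
    for t in range(num_iters):
        if fixed_partition:
            S = partition[t % len(partition)]
        else:
            S = rng.choice(n, size=block_size, replace=False)
        # Exact minimization over the block:
        # x_S <- x_S + A_{SS}^{-1} (b - A x)_S.
        residual = b[S] - A[S, :] @ x
        x[S] += np.linalg.solve(A[np.ix_(S, S)], residual)
    return x

# Hypothetical usage on a random positive-definite system.
rng = np.random.default_rng(1)
M = rng.standard_normal((100, 100))
A = M @ M.T + 0.1 * np.eye(100)
b = rng.standard_normal(100)
x = block_gauss_seidel(A, b, block_size=10, num_iters=500)
print(np.linalg.norm(A @ x - b))
```

Each iteration solves the subproblem exactly over the chosen block, so the objective (1/2)xᵀAx − bᵀx is monotonically non-increasing under either scheme; the paper's point is about how the choice of blocks affects the rate.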
Cite
Text
Tu et al. "Breaking Locality Accelerates Block Gauss-Seidel." International Conference on Machine Learning, 2017.
Markdown
[Tu et al. "Breaking Locality Accelerates Block Gauss-Seidel." International Conference on Machine Learning, 2017.](https://mlanthology.org/icml/2017/tu2017icml-breaking/)
BibTeX
@inproceedings{tu2017icml-breaking,
title = {{Breaking Locality Accelerates Block Gauss-Seidel}},
author = {Tu, Stephen and Venkataraman, Shivaram and Wilson, Ashia C. and Gittens, Alex and Jordan, Michael I. and Recht, Benjamin},
booktitle = {International Conference on Machine Learning},
year = {2017},
pages = {3482--3491},
volume = {70},
url = {https://mlanthology.org/icml/2017/tu2017icml-breaking/}
}