Communication Lower Bounds for Distributed Convex Optimization: Partition Data on Features
Abstract
Recently, there has been increasing interest in designing distributed convex optimization algorithms for the setting where the data matrix is partitioned on features. Algorithms in this setting can have substantial advantages over those where data is partitioned on samples, especially when the number of features is huge, so it is important to understand the inherent limitations of such optimization procedures. In this paper, under certain restrictions on the communication allowed in the procedures, we develop tight lower bounds on communication rounds for a broad class of non-incremental algorithms in this setting. We also provide a lower bound on communication rounds for a class of (randomized) incremental algorithms.
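To make the feature-partitioned setting concrete, below is a minimal sketch (not from the paper) of synchronous distributed gradient descent on a least-squares objective, where each worker owns a column block of the data matrix and one aggregation of partial products counts as a communication round; the worker count, partition, objective, and step size are illustrative assumptions.

import numpy as np

# Sketch of the feature-partitioned setting: each of m workers holds a
# column block A_k of the data matrix A and the matching block x_k of
# the parameter vector. One "communication round" per iteration: the
# partial products A_k @ x_k are aggregated so every worker can form
# the shared residual A @ x - b. (Illustrative only; the paper's lower
# bounds concern how many such rounds any algorithm must use.)

rng = np.random.default_rng(0)
n, d, m = 100, 40, 4                        # samples, features, workers
A, b = rng.standard_normal((n, d)), rng.standard_normal(n)
blocks = np.array_split(np.arange(d), m)    # feature partition across workers
x = np.zeros(d)
step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L for f(x) = 0.5*||Ax - b||^2

for _ in range(200):
    # Communication round: aggregate the partial predictions A_k @ x_k.
    partial = [A[:, blk] @ x[blk] for blk in blocks]
    residual = sum(partial) - b             # shared across workers
    # Local updates: each worker touches only its own columns A_k.
    for blk in blocks:
        x[blk] -= step * (A[:, blk].T @ residual)

print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)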
Cite
Text
Chen et al. "Communication Lower Bounds for Distributed Convex Optimization: Partition Data on Features." AAAI Conference on Artificial Intelligence, 2017. doi:10.1609/AAAI.V31I1.10912
Markdown
[Chen et al. "Communication Lower Bounds for Distributed Convex Optimization: Partition Data on Features." AAAI Conference on Artificial Intelligence, 2017.](https://mlanthology.org/aaai/2017/chen2017aaai-communication/) doi:10.1609/AAAI.V31I1.10912
BibTeX
@inproceedings{chen2017aaai-communication,
title = {{Communication Lower Bounds for Distributed Convex Optimization: Partition Data on Features}},
author = {Chen, Zihao and Luo, Luo and Zhang, Zhihua},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2017},
pages = {1812-1818},
doi = {10.1609/AAAI.V31I1.10912},
url = {https://mlanthology.org/aaai/2017/chen2017aaai-communication/}
}