Distributed Nyström Kernel Learning with Communications
Abstract
We study the statistical performance of distributed kernel ridge regression with Nyström approximation (DKRR-NY) and with Nyström approximation plus iterative solvers (DKRR-NY-PCG), and derive optimal learning rates that enlarge the admissible range of the number of local processors $p$, up to the optimal one, beyond existing state-of-the-art bounds. More precisely, our theoretical analysis shows that, in expectation, DKRR-NY and DKRR-NY-PCG achieve the same learning rates as exact KRR while requiring essentially $\mathcal{O}(|D|^{1.5})$ time and $\mathcal{O}(|D|)$ memory and relaxing the restriction on $p$, where $|D|$ is the number of data points; these in-expectation results reflect the average behavior over multiple trials. Furthermore, to characterize the generalization performance in a single trial, we derive learning rates for DKRR-NY and DKRR-NY-PCG in probability. Finally, we propose a novel algorithm, DKRR-NY-CM, which builds on DKRR-NY and employs a communication strategy to further improve learning performance; the effectiveness of these communications is validated both theoretically and experimentally.
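For intuition, the following is a minimal sketch of the averaging form of distributed kernel ridge regression with Nyström approximation, the setting DKRR-NY builds on: the data are split over $p$ local machines, each machine fits a Nyström KRR estimator on its own partition, and the local predictions are averaged. The kernel choice, landmark count `m`, regularization `lam`, and all function names are illustrative assumptions rather than the authors' implementation; the iterative-solver and communication variants (DKRR-NY-PCG, DKRR-NY-CM) are not shown.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of A and rows of B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def local_nystrom_krr(X, y, m, lam, sigma=1.0, seed=0):
    # Nystrom KRR on one local partition: draw m landmark points, then solve
    # the regularized least-squares problem in the landmark basis:
    #   (K_nm^T K_nm + n * lam * K_mm) alpha = K_nm^T y
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(m, len(X)), replace=False)
    C = X[idx]                                    # Nystrom landmark points
    K_nm = gaussian_kernel(X, C, sigma)           # n x m cross-kernel
    K_mm = gaussian_kernel(C, C, sigma)           # m x m landmark kernel
    A = K_nm.T @ K_nm + len(X) * lam * K_mm
    alpha = np.linalg.solve(A + 1e-10 * np.eye(len(C)), K_nm.T @ y)
    return C, alpha

def dkrr_ny_predict(X, y, X_test, p=4, m=50, lam=1e-3, sigma=1.0, seed=0):
    # Averaging form of distributed KRR with Nystrom: split the data over p
    # local machines, fit a Nystrom KRR estimator on each partition, and
    # average the local predictions on the test points.
    parts = np.array_split(np.arange(len(X)), p)
    preds = []
    for j, part in enumerate(parts):
        C, alpha = local_nystrom_krr(X[part], y[part], m, lam, sigma, seed + j)
        preds.append(gaussian_kernel(X_test, C, sigma) @ alpha)
    return np.mean(preds, axis=0)

if __name__ == "__main__":
    # Toy 1-D regression example (hypothetical data, for illustration only).
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(2000, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(2000)
    X_test = np.linspace(-3, 3, 200)[:, None]
    y_hat = dkrr_ny_predict(X, y, X_test, p=4, m=50, lam=1e-4)
    print("test MSE vs. noiseless target:",
          np.mean((y_hat - np.sin(X_test[:, 0]))**2))
```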
Cite
Text
Yin et al. "Distributed Nyström Kernel Learning with Communications." International Conference on Machine Learning, 2021.Markdown
[Yin et al. "Distributed Nyström Kernel Learning with Communications." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/yin2021icml-distributed/)BibTeX
@inproceedings{yin2021icml-distributed,
  title     = {{Distributed Nyström Kernel Learning with Communications}},
  author    = {Yin, Rong and Wang, Weiping and Meng, Dan},
  booktitle = {International Conference on Machine Learning},
  year      = {2021},
  pages     = {12019--12028},
  volume    = {139},
  url       = {https://mlanthology.org/icml/2021/yin2021icml-distributed/}
}