Communication-Efficient Distributed Online Learning with Kernels

Abstract

We propose an efficient distributed online learning protocol for low-latency real-time services. It extends a previously presented protocol to kernelized online learners that represent their models by a support vector expansion. While such learners often achieve higher predictive performance than their linear counterparts, communicating the support vector expansions becomes inefficient for large numbers of support vectors. The proposed extension allows for a larger class of online learning algorithms, including those that alleviate this problem through model compression. In addition, we characterize the quality of the proposed protocol by introducing a novel criterion that requires the communication to be bounded by the loss suffered.
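To make the setting concrete, below is a minimal Python sketch of the kind of learner the protocol targets: a NORMA-style kernel online learner whose model is a support vector expansion, with a fixed-budget eviction rule as a simple stand-in for model compression, plus an RKHS divergence that a dynamic synchronization protocol could compare against a threshold to decide when a node communicates. All names, parameters, and the budget heuristic are illustrative assumptions, not the paper's implementation.

import numpy as np

def rbf(x, y, gamma=1.0):
    # Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2))

class KernelOnlineLearner:
    # NORMA-style kernel online learner: the model is the support vector
    # expansion f(x) = sum_i alpha_i * k(x_i, x). The fixed budget that
    # drops the oldest support vector is an assumed, simple form of
    # model compression.
    def __init__(self, eta=0.1, lam=0.01, budget=50, gamma=1.0):
        self.eta, self.lam, self.budget, self.gamma = eta, lam, budget, gamma
        self.sv, self.alpha = [], []  # support vectors and coefficients

    def predict(self, x):
        return sum(a * rbf(s, x, self.gamma)
                   for s, a in zip(self.sv, self.alpha))

    def update(self, x, y):
        # Hinge-loss SGD step: shrink all coefficients (regularization),
        # add x as a new support vector whenever the margin is violated.
        loss = max(0.0, 1.0 - y * self.predict(x))
        self.alpha = [(1.0 - self.eta * self.lam) * a for a in self.alpha]
        if loss > 0.0:
            self.sv.append(x)
            self.alpha.append(self.eta * y)
            if len(self.sv) > self.budget:  # compress: evict oldest SV
                self.sv.pop(0)
                self.alpha.pop(0)
        return loss

def rkhs_sqdist(f, g, gamma=1.0):
    # Squared RKHS distance ||f - g||^2 between two expansions: the
    # quantity a local node could check against a divergence threshold
    # before sending its (possibly large) expansion to a coordinator.
    def inner(a, b):
        return sum(ai * bj * rbf(si, sj, gamma)
                   for si, ai in zip(a.sv, a.alpha)
                   for sj, bj in zip(b.sv, b.alpha))
    return inner(f, f) - 2.0 * inner(f, g) + inner(g, g)

Under this sketch, a node would communicate only while rkhs_sqdist between its local model and the last synchronized reference exceeds a threshold, so communication is triggered by model drift rather than by a fixed schedule; the threshold and the oldest-first eviction rule are assumptions chosen for brevity.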

Cite

Text

Kamp et al. "Communication-Efficient Distributed Online Learning with Kernels." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2016. doi:10.1007/978-3-319-46227-1_50

Markdown

[Kamp et al. "Communication-Efficient Distributed Online Learning with Kernels." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2016.](https://mlanthology.org/ecmlpkdd/2016/kamp2016ecmlpkdd-communicationefficient/) doi:10.1007/978-3-319-46227-1_50

BibTeX

@inproceedings{kamp2016ecmlpkdd-communicationefficient,
  title     = {{Communication-Efficient Distributed Online Learning with Kernels}},
  author    = {Kamp, Michael and Bothe, Sebastian and Boley, Mario and Mock, Michael},
  booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
  year      = {2016},
  pages     = {805--819},
  doi       = {10.1007/978-3-319-46227-1_50},
  url       = {https://mlanthology.org/ecmlpkdd/2016/kamp2016ecmlpkdd-communicationefficient/}
}