Mini-Batch Primal and Dual Methods for SVMs
Abstract
We address the issue of using mini-batches in stochastic optimization of SVMs. We show that the same quantity, the spectral norm of the data, controls the parallelization speedup obtained for both primal stochastic subgradient descent (SGD) and stochastic dual coordinate ascent (SDCA) methods, and we use it to derive novel variants of mini-batched SDCA. Our guarantees for both methods are expressed in terms of the original nonsmooth primal problem based on the hinge loss.
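The abstract refers to mini-batched primal SGD for the hinge-loss SVM. Below is a minimal illustrative sketch in the style of Pegasos (step size 1/(λt), with subgradients averaged over a mini-batch); the function name `minibatch_pegasos`, the synthetic data, and all parameter values are assumptions for illustration, not the paper's exact algorithm or its mini-batched SDCA variants.

```python
import numpy as np

def minibatch_pegasos(X, y, lam=0.01, batch_size=8, n_iters=1000, seed=0):
    """Mini-batch primal SGD sketch for the hinge-loss SVM objective
    (lam/2)*||w||^2 + (1/n) * sum_i max(0, 1 - y_i <w, x_i>).
    Uses the Pegasos step size 1/(lam*t) and averages the hinge
    subgradient over each mini-batch."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, n_iters + 1):
        idx = rng.choice(n, size=batch_size, replace=False)
        Xb, yb = X[idx], y[idx]
        margins = yb * (Xb @ w)
        active = margins < 1.0  # examples contributing a nonzero hinge subgradient
        eta = 1.0 / (lam * t)
        # subgradient: lam*w - (1/b) * sum over active examples of y_i * x_i
        grad = lam * w - (yb[active] @ Xb[active]) / batch_size
        w -= eta * grad
    return w

# Toy usage on hypothetical synthetic data from a random hyperplane.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 20))
y = np.sign(X @ rng.normal(size=20))
w = minibatch_pegasos(X, y, lam=0.01, batch_size=16, n_iters=2000)
print("training hinge loss:", np.maximum(0.0, 1 - y * (X @ w)).mean())
```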
Cite
Text
Takac et al. "Mini-Batch Primal and Dual Methods for SVMs." International Conference on Machine Learning, 2013.

Markdown

[Takac et al. "Mini-Batch Primal and Dual Methods for SVMs." International Conference on Machine Learning, 2013.](https://mlanthology.org/icml/2013/takac2013icml-minibatch/)

BibTeX
@inproceedings{takac2013icml-minibatch,
  title = {{Mini-Batch Primal and Dual Methods for SVMs}},
  author = {Takac, Martin and Bijral, Avleen and Richtarik, Peter and Srebro, Nati},
  booktitle = {International Conference on Machine Learning},
  year = {2013},
  pages = {1022--1030},
  volume = {28},
  url = {https://mlanthology.org/icml/2013/takac2013icml-minibatch/}
}