Variance Reduced Coordinate Descent with Acceleration: New Method with a Surprising Application to Finite-Sum Problems
Abstract
We propose ASVRCD, an accelerated version of stochastic variance reduced coordinate descent. Like other variance reduced coordinate descent methods such as SEGA or SVRCD, our method can handle problems with a non-separable and non-smooth regularizer while accessing only a random block of partial derivatives in each iteration. However, ASVRCD incorporates Nesterov's momentum, which yields better iteration complexity guarantees than both SEGA and SVRCD. As a by-product of our theory, we show that a variant of Katyusha (Allen-Zhu, 2017) is a specific case of ASVRCD, recovering the optimal oracle complexity for finite-sum objectives.
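To make the ingredients named in the abstract concrete, here is a minimal conceptual sketch, in Python, of a variance-reduced proximal coordinate-descent step coupled with a Nesterov-style momentum sequence, run on a toy l1-regularized least-squares problem. The SEGA-like gradient table, the step sizes, and the three-sequence coupling below are illustrative assumptions chosen for exposition; they are not the paper's exact ASVRCD recursion or parameter choices.

```python
import numpy as np

# Toy problem: f(x) = 0.5 * ||Ax - b||^2 + lam * ||x||_1.
rng = np.random.default_rng(0)
n, d = 50, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
lam = 0.1  # illustrative regularization weight

def partial_grad(x, i):
    # i-th partial derivative of the smooth part 0.5 * ||Ax - b||^2
    return A[:, i] @ (A @ x - b)

def prox_l1(v, t):
    # prox of t * lam * ||.||_1 (soft-thresholding); l1 happens to be
    # separable, but the prox step is what lets such methods cover
    # non-smooth regularizers in general
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

L = np.linalg.norm(A, 2) ** 2           # smoothness constant of f
eta, theta = 1.0 / (4 * L), 0.1         # illustrative step/momentum sizes
x = y = z = np.zeros(d)
# SEGA-like table of the last seen partial derivatives
h = np.array([partial_grad(np.zeros(d), i) for i in range(d)])

for _ in range(20000):
    x = theta * z + (1 - theta) * y     # momentum coupling
    i = rng.integers(d)                 # one random coordinate per step
    gi = partial_grad(x, i)
    g = h.copy()
    g[i] += d * (gi - h[i])             # unbiased variance-reduced estimate
    h[i] = gi                           # refresh the gradient table
    z_new = prox_l1(z - eta * g, eta)   # proximal step on the z-sequence
    y = x + theta * (z_new - z)         # Nesterov-style interpolation
    z = z_new

print(0.5 * np.linalg.norm(A @ y - b) ** 2 + lam * np.abs(y).sum())
```

The structural points match the abstract: each iteration queries a single partial derivative, variance reduction comes from correcting a stored gradient table, and the non-smooth term enters only through a prox step.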
Cite
Text
Hanzely et al. "Variance Reduced Coordinate Descent with Acceleration: New Method with a Surprising Application to Finite-Sum Problems." International Conference on Machine Learning, 2020.
Markdown
[Hanzely et al. "Variance Reduced Coordinate Descent with Acceleration: New Method with a Surprising Application to Finite-Sum Problems." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/hanzely2020icml-variance/)
BibTeX
@inproceedings{hanzely2020icml-variance,
title = {{Variance Reduced Coordinate Descent with Acceleration: New Method with a Surprising Application to Finite-Sum Problems}},
author = {Hanzely, Filip and Kovalev, Dmitry and Richtarik, Peter},
booktitle = {International Conference on Machine Learning},
year = {2020},
pages = {4039--4048},
volume = {119},
url = {https://mlanthology.org/icml/2020/hanzely2020icml-variance/}
}