Variance-Reduced Stochastic Gradient Descent on Streaming Data
Abstract
We present STRSAGA, an algorithm for efficiently maintaining a machine learning model over data points that arrive over time, quickly updating the model as new training data is observed. We give a competitive analysis comparing the sub-optimality of the model maintained by STRSAGA with that of an offline algorithm that is given the entire dataset beforehand, and analyze the risk-competitiveness of STRSAGA under different arrival patterns. Our theoretical and experimental results show that the risk of STRSAGA is comparable to that of offline algorithms on a variety of input arrival patterns, and that its experimental performance is significantly better than that of prior algorithms suited for streaming data, such as SGD and SSVRG.
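The abstract only names the technique, so as a rough illustration of the kind of update STRSAGA builds on, here is a minimal sketch of a SAGA-style variance-reduced loop over streaming data for a least-squares loss. The function `streaming_saga`, the `steps_per_arrival` update budget, and the choice to initialize each new point's gradient-table entry at the current iterate are illustrative assumptions for this sketch, not the paper's exact STRSAGA procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def streaming_saga(stream, dim, steps_per_arrival=2, lr=0.01):
    """Hypothetical SAGA-style streaming loop (least-squares loss).

    `stream` yields (x, y) pairs one at a time. Each arrival triggers
    `steps_per_arrival` variance-reduced updates: the first folds the
    new point into the effective sample set; the rest revisit random
    stored points, as in SAGA.
    """
    w = np.zeros(dim)
    xs, ys, grads = [], [], []       # effective sample set + gradient table
    grad_sum = np.zeros(dim)         # running sum of stored gradients

    def saga_update(i):
        nonlocal w, grad_sum
        g_new = (xs[i] @ w - ys[i]) * xs[i]       # gradient of 0.5*(x.w - y)^2
        g_avg = grad_sum / len(grads)             # average of stored gradients
        w = w - lr * (g_new - grads[i] + g_avg)   # variance-reduced SAGA step
        grad_sum += g_new - grads[i]              # keep the table and sum in sync
        grads[i] = g_new

    for x, y in stream:
        # Admit the new point: store it along with its gradient at the current w.
        xs.append(x); ys.append(y)
        g = (x @ w - y) * x
        grads.append(g); grad_sum += g
        saga_update(len(xs) - 1)
        # Spend the remaining budget revisiting random stored points.
        for _ in range(steps_per_arrival - 1):
            saga_update(int(rng.integers(len(xs))))
    return w

# Example: fit w on a simulated stream of noisy linear measurements.
def stream(n=1000, dim=5):
    w_true = rng.normal(size=dim)
    for _ in range(n):
        x = rng.normal(size=dim)
        yield x, x @ w_true + 0.01 * rng.normal()

w_hat = streaming_saga(stream(), dim=5)
```

The split of the per-arrival budget between new and previously seen points is the knob that, in the paper's setting, governs how the maintained model tracks the offline solution under different arrival patterns; the 1-new/rest-random split above is just one simple choice.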
Cite
Text
Jothimurugesan et al. "Variance-Reduced Stochastic Gradient Descent on Streaming Data." Neural Information Processing Systems, 2018.
Markdown
[Jothimurugesan et al. "Variance-Reduced Stochastic Gradient Descent on Streaming Data." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/jothimurugesan2018neurips-variancereduced/)
BibTeX
@inproceedings{jothimurugesan2018neurips-variancereduced,
title = {{Variance-Reduced Stochastic Gradient Descent on Streaming Data}},
author = {Jothimurugesan, Ellango and Tahmasbi, Ashraf and Gibbons, Phillip and Tirthapura, Srikanta},
booktitle = {Neural Information Processing Systems},
year = {2018},
pages = {9906--9915},
url = {https://mlanthology.org/neurips/2018/jothimurugesan2018neurips-variancereduced/}
}