Adaptive Sampling for Incremental Optimization Using Stochastic Gradient Descent
Abstract
A wide collection of popular statistical learning methods, ranging from K-means to Support Vector Machines through Neural Networks, can be formulated as a stochastic gradient descent (SGD) algorithm in a specific setup. In practice, the main limitation of this incremental optimization technique stems from the stochastic noise induced by selecting, uniformly at random, the data involved in the gradient estimate at each iteration. It is the purpose of this paper to introduce a novel implementation of the SGD algorithm, where the data subset used at a given step is not picked uniformly at random among all possible subsets but drawn from a specific adaptive sampling scheme, depending on the past iterations in a Markovian manner, in order to refine the current statistical estimation of the gradient. Beyond an algorithmic description of the approach we propose, rate bounds are established and illustrative numerical results are displayed in order to provide theoretical and empirical evidence of its statistical performance, compared to more “naive” SGD implementations. Computational issues are also discussed at length, revealing the practical advantages of the method promoted.
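As an illustration of the general idea (not the paper's exact scheme), the sketch below runs SGD on a least-squares problem where each example's sampling probability is adapted from gradient norms observed at past iterations, so the sampler depends on the history in a Markovian manner; importance weights keep the gradient estimate unbiased. The score-update rule, the mixing with the uniform distribution, and all numerical constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem (illustrative data, not from the paper).
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

w = np.zeros(d)
scores = np.ones(n)  # per-example importance scores, adapted over time
mse_init = np.mean((X @ w - y) ** 2)

for t in range(2000):
    # Adaptive sampling distribution: mix with uniform so no probability
    # collapses to zero and importance weights stay bounded.
    p = 0.5 * scores / scores.sum() + 0.5 / n
    i = rng.choice(n, p=p)

    # Gradient of the i-th loss 0.5 * (x_i . w - y_i)^2, importance-weighted
    # by 1 / (n p_i) so its expectation equals the full-sample gradient.
    g = (X[i] @ w - y[i]) * X[i]
    step = 0.1 / np.sqrt(t + 1)
    w -= step * g / (n * p[i])

    # Markovian update: the sampler for the next step depends on the
    # current iterate through the observed gradient norm.
    scores[i] = 0.9 * scores[i] + 0.1 * (np.linalg.norm(g) + 1e-8)

mse_final = np.mean((X @ w - y) ** 2)
print(mse_init, mse_final)
```

With the adapted distribution, examples that recently produced large gradients are revisited more often, which is the intuition behind refining the gradient estimate relative to uniform sampling.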
Cite
Text
Papa et al. "Adaptive Sampling for Incremental Optimization Using Stochastic Gradient Descent." International Conference on Algorithmic Learning Theory, 2015. doi:10.1007/978-3-319-24486-0_21
Markdown
[Papa et al. "Adaptive Sampling for Incremental Optimization Using Stochastic Gradient Descent." International Conference on Algorithmic Learning Theory, 2015.](https://mlanthology.org/alt/2015/papa2015alt-adaptive/) doi:10.1007/978-3-319-24486-0_21
BibTeX
@inproceedings{papa2015alt-adaptive,
title = {{Adaptive Sampling for Incremental Optimization Using Stochastic Gradient Descent}},
author = {Papa, Guillaume and Bianchi, Pascal and Clémençon, Stéphan},
booktitle = {International Conference on Algorithmic Learning Theory},
year = {2015},
pages = {317--331},
doi = {10.1007/978-3-319-24486-0_21},
url = {https://mlanthology.org/alt/2015/papa2015alt-adaptive/}
}