Generalization Properties and Implicit Regularization for Multiple Passes SGM

Abstract

We study the generalization properties of stochastic gradient methods for learning with convex loss functions and linearly parameterized functions. We show that, in the absence of penalizations or constraints, the stability and approximation properties of the algorithm can be controlled by tuning either the step-size or the number of passes over the data. In this view, these parameters can be seen to control a form of implicit regularization. Numerical results complement the theoretical findings.
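To make the setting concrete, below is a minimal Python sketch of multiple-pass SGM for least squares with a linear model, run without any penalty or constraint, where the step-size and the number of passes are the only tuning knobs. The function name `sgm_multipass`, the synthetic data, and the fixed step-size are illustrative assumptions, not the paper's exact experimental protocol.

import numpy as np

def sgm_multipass(X, y, step_size=0.1, n_passes=5, seed=0):
    """Plain multiple-pass SGM for least squares with a linear model.

    No explicit penalty or constraint is used: regularization is
    implicit, controlled by `step_size` and `n_passes` (illustrative
    choices, not prescriptions from the paper).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_passes):            # one pass = one epoch over the data
        for i in rng.permutation(n):     # visit each point once per pass
            grad = (X[i] @ w - y[i]) * X[i]  # gradient of the squared loss at point i
            w -= step_size * grad
    return w

# Illustrative usage on synthetic data: few passes act like early
# stopping (stronger implicit regularization), while many passes move
# the iterate toward the unregularized empirical risk minimizer.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
w_true = rng.standard_normal(10)
y = X @ w_true + 0.5 * rng.standard_normal(200)
for t in (1, 5, 50):
    w = sgm_multipass(X, y, step_size=0.05, n_passes=t)
    print(t, np.linalg.norm(w - w_true))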

Cite

Text

Lin et al. "Generalization Properties and Implicit Regularization for Multiple Passes SGM." International Conference on Machine Learning, 2016.

Markdown

[Lin et al. "Generalization Properties and Implicit Regularization for Multiple Passes SGM." International Conference on Machine Learning, 2016.](https://mlanthology.org/icml/2016/lin2016icml-generalization/)

BibTeX

@inproceedings{lin2016icml-generalization,
  title     = {{Generalization Properties and Implicit Regularization for Multiple Passes SGM}},
  author    = {Lin, Junhong and Camoriano, Raffaello and Rosasco, Lorenzo},
  booktitle = {International Conference on Machine Learning},
  year      = {2016},
  pages     = {2340--2348},
  volume    = {48},
  url       = {https://mlanthology.org/icml/2016/lin2016icml-generalization/}
}