Stochastic Gradient Flow Dynamics of Test Risk and Its Exact Solution for Weak Features
Abstract
We investigate the test risk of continuous-time stochastic gradient flow dynamics in learning theory. Using a path-integral formulation, we provide, in the regime of small learning rate, a general formula for computing the difference between the test risk curves of pure gradient and stochastic gradient flows. We apply the general theory to a simple model of weak features, which displays the double descent phenomenon, and explicitly compute the corrections brought about by the added stochastic term in the dynamics, as a function of time and model parameters. The analytical results are compared to simulations of discrete-time stochastic gradient descent and show good agreement.
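The abstract compares analytical test risk curves against simulations of discrete-time stochastic gradient descent. The snippet below is a minimal sketch of that kind of experiment, not the paper's actual model or code: it trains a simple "weak features" style linear regression (a student that sees only the first p of d features) with mini-batch SGD and with full-batch gradient descent as a stand-in for pure gradient flow, and tracks the test risk of both. All names, dimensions, the learning rate, batch size, and noise level are illustrative assumptions.

# Minimal sketch (not the paper's exact setup): discrete-time SGD on a
# toy weak-features linear regression, tracking test risk over training
# and comparing against full-batch gradient descent as a proxy for
# pure gradient flow. All hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

d, p, n = 200, 120, 150               # ambient dim, student features, train samples
noise_std, lr, batch, steps = 0.1, 5e-3, 10, 20000

theta_star = rng.standard_normal(d) / np.sqrt(d)   # teacher vector
X_train = rng.standard_normal((n, d))
y_train = X_train @ theta_star + noise_std * rng.standard_normal(n)
X_test = rng.standard_normal((5000, d))
y_test = X_test @ theta_star + noise_std * rng.standard_normal(5000)

def test_risk(w):
    # Mean squared test error of a student using only the first p features.
    pred = X_test[:, :p] @ w
    return np.mean((pred - y_test) ** 2)

def run(stochastic):
    w = np.zeros(p)
    risks = []
    for t in range(steps):
        if stochastic:
            idx = rng.choice(n, size=batch, replace=False)   # mini-batch SGD
        else:
            idx = np.arange(n)                               # full batch ~ gradient flow
        Xb, yb = X_train[idx, :p], y_train[idx]
        grad = Xb.T @ (Xb @ w - yb) / len(idx)
        w -= lr * grad
        if t % 100 == 0:
            risks.append(test_risk(w))
    return np.array(risks)

risk_gd = run(stochastic=False)
risk_sgd = run(stochastic=True)
print("final test risk  GD: %.4f   SGD: %.4f" % (risk_gd[-1], risk_sgd[-1]))
print("max |SGD - GD| along the curve: %.4f" % np.max(np.abs(risk_sgd - risk_gd)))

In such a toy run the SGD risk curve deviates from the full-batch curve by a correction that shrinks as the learning rate decreases; the paper's contribution is an analytical expression for this type of difference between gradient and stochastic gradient flow test risk curves in the small-learning-rate regime, evaluated exactly for its weak-features model.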
Cite
Text
Veiga et al. "Stochastic Gradient Flow Dynamics of Test Risk and Its Exact Solution for Weak Features." International Conference on Machine Learning, 2024.
Markdown
[Veiga et al. "Stochastic Gradient Flow Dynamics of Test Risk and Its Exact Solution for Weak Features." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/veiga2024icml-stochastic/)
BibTeX
@inproceedings{veiga2024icml-stochastic,
title = {{Stochastic Gradient Flow Dynamics of Test Risk and Its Exact Solution for Weak Features}},
author = {Veiga, Rodrigo and Remizova, Anastasia and Macris, Nicolas},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {49310--49344},
volume = {235},
url = {https://mlanthology.org/icml/2024/veiga2024icml-stochastic/}
}