Error Dynamics of Mini-Batch Gradient Descent with Random Reshuffling for Least Squares Regression
Abstract
We study the discrete dynamics of mini-batch gradient descent with random reshuffling for least squares regression. We show that the training and generalization errors depend on a sample cross-covariance matrix $Z$ between the original features $X$ and a set of new features $\widetilde{X}$, in which each feature is modified, in an averaged sense, by the mini-batches that appear before it during the learning process. Using this representation, we establish that the dynamics of mini-batch and full-batch gradient descent agree up to leading order in the step size under the linear scaling rule. However, mini-batch gradient descent with random reshuffling exhibits a subtle dependence on the step size that a gradient flow analysis cannot detect, such as convergence to a limit that depends on the step size. By asymptotically comparing $Z$, a non-commutative polynomial of random matrices, with the sample covariance matrix of $X$, we demonstrate that batching affects the dynamics by inducing a form of shrinkage on the spectrum.
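To make the setting concrete, the following is a minimal sketch (not the authors' code) of mini-batch gradient descent with random reshuffling on a least squares objective, compared against full-batch gradient descent with the step size scaled by the number of mini-batches per epoch, which is one reading of the linear scaling rule mentioned in the abstract. The function names, the synthetic Gaussian data, and all parameter values are illustrative assumptions.

```python
import numpy as np

def minibatch_gd_rr(X, y, step_size, batch_size, n_epochs, rng):
    """Mini-batch gradient descent with random reshuffling on
    the least squares objective (1/2n) * ||X w - y||^2."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_epochs):
        perm = rng.permutation(n)  # reshuffle the data once per epoch
        for start in range(0, n, batch_size):
            idx = perm[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            grad = Xb.T @ (Xb @ w - yb) / len(idx)
            w -= step_size * grad
    return w

def fullbatch_gd(X, y, step_size, n_steps):
    """Full-batch gradient descent on the same objective."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_steps):
        w -= step_size * X.T @ (X @ w - y) / n
    return w

# Synthetic least squares problem (illustrative values only).
rng = np.random.default_rng(0)
n, d = 512, 32
X = rng.standard_normal((n, d))
w_star = rng.standard_normal(d)
y = X @ w_star + 0.1 * rng.standard_normal(n)

batch_size, base_step, n_epochs = 64, 0.05, 50
w_mb = minibatch_gd_rr(X, y, base_step, batch_size, n_epochs, rng)
# Linear scaling rule (as assumed here): one full-batch step per epoch,
# with the step size multiplied by the number of mini-batches per epoch.
w_fb = fullbatch_gd(X, y, base_step * (n // batch_size), n_epochs)
print(np.linalg.norm(w_mb - w_star), np.linalg.norm(w_fb - w_star))
```

Under this scaling, the two iterates stay close for small step sizes, consistent with the leading-order agreement described in the abstract, while their limits can differ at higher order in the step size.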
Cite
Text
Lok et al. "Error Dynamics of Mini-Batch Gradient Descent with Random Reshuffling for Least Squares Regression." Proceedings of The 36th International Conference on Algorithmic Learning Theory, 2025.Markdown
[Lok et al. "Error Dynamics of Mini-Batch Gradient Descent with Random Reshuffling for Least Squares Regression." Proceedings of The 36th International Conference on Algorithmic Learning Theory, 2025.](https://mlanthology.org/alt/2025/lok2025alt-error/)BibTeX
@inproceedings{lok2025alt-error,
title = {{Error Dynamics of Mini-Batch Gradient Descent with Random Reshuffling for Least Squares Regression}},
author = {Lok, Jackie and Sonthalia, Rishi and Rebrova, Elizaveta},
booktitle = {Proceedings of The 36th International Conference on Algorithmic Learning Theory},
year = {2025},
pages = {736--770},
volume = {272},
url = {https://mlanthology.org/alt/2025/lok2025alt-error/}
}