Statistical Dynamics of Batch Learning
Abstract
An important issue in neural computing concerns the description of learning dynamics with macroscopic dynamical variables. Recent progress on on-line learning only addresses the often unrealistic case of an infinite training set. We introduce a new framework to model batch learning of restricted sets of examples, widely applicable to any learning cost function, and fully taking into account the temporal correlations introduced by the recycling of the examples. For illustration we analyze the effects of weight decay and early stopping during the learning of teacher-generated examples.
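The setting the abstract describes — a student trained by repeatedly recycling a fixed batch of teacher-generated examples, regularized by weight decay and halted by early stopping — can be sketched concretely. The following is a minimal illustration, not the paper's macroscopic formalism: a linear student fit by batch gradient descent, where the teacher, the dimensions, and all hyperparameter values are assumptions chosen for the sketch.

```python
import random

random.seed(0)
N = 20            # input dimension (assumed for illustration)
P = 30            # size of the restricted training set; examples are recycled
lr, decay = 0.05, 0.01   # learning rate and weight-decay strength (assumed)

# A fixed linear teacher generates the examples.
teacher = [random.gauss(0, 1) for _ in range(N)]

def sample(p):
    xs = [[random.gauss(0, 1) for _ in range(N)] for _ in range(p)]
    ys = [sum(w * xi for w, xi in zip(teacher, x)) for x in xs]
    return xs, ys

train_x, train_y = sample(P)
val_x, val_y = sample(P)      # held-out set used for early stopping

def mse(w, xs, ys):
    return sum((sum(wi * xi for wi, xi in zip(w, x)) - y) ** 2
               for x, y in zip(xs, ys)) / len(xs)

w = [0.0] * N
best_val, best_w, patience = float("inf"), w[:], 0
for epoch in range(500):
    # Batch gradient: weight-decay term plus the gradient of the
    # training error, accumulated over the whole (recycled) batch.
    grad = [decay * wi for wi in w]
    for x, y in zip(train_x, train_y):
        err = sum(wi * xi for wi, xi in zip(w, x)) - y
        for i in range(N):
            grad[i] += 2 * err * x[i] / P
    w = [wi - lr * gi for wi, gi in zip(w, grad)]

    v = mse(w, val_x, val_y)
    if v < best_val:
        best_val, best_w, patience = v, w[:], 0
    else:
        patience += 1
        if patience >= 20:    # early stopping: validation error stopped improving
            break

print(round(best_val, 4))
```

Because every epoch reuses the same P examples, the weight updates become correlated with the training data over time — the temporal correlation that the paper's framework is built to account for, and that an infinite-training-set analysis ignores.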
Li and Wong. "Statistical Dynamics of Batch Learning." Neural Information Processing Systems, 1999.
@inproceedings{li1999neurips-statistical,
title = {{Statistical Dynamics of Batch Learning}},
author = {Li, Song and Wong, K. Y. Michael},
booktitle = {Neural Information Processing Systems},
year = {1999},
pages = {286-292},
url = {https://mlanthology.org/neurips/1999/li1999neurips-statistical/}
}