No Free Lunch for Early Stopping
Abstract
We show that with a uniform prior on models having the same training error, early stopping at some fixed training error above the training error minimum results in an increase in the expected generalization error.
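The stopping rule the abstract analyzes can be sketched as follows: halt gradient descent as soon as the training error reaches a fixed level above the minimum achievable training error, instead of training to the minimum. This is a minimal illustration only, assuming a synthetic linear-regression setup; the data, model, and all names are illustrative and not from the paper, and a single run cannot exhibit the paper's expectation over models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not from the paper): noisy linear data, linear model.
n, d = 50, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.5 * rng.normal(size=n)

def train_err(w):
    """Mean squared training error of weight vector w."""
    return np.mean((X @ w - y) ** 2)

# Minimum achievable training error (least-squares solution).
w_star, *_ = np.linalg.lstsq(X, y, rcond=None)
E_min = train_err(w_star)

def early_stop(margin, lr=0.01, max_iters=100_000):
    """Gradient descent halted once the training error reaches
    E_min + margin -- the fixed level 'above the training error
    minimum' referred to in the abstract."""
    w = np.zeros(d)
    for _ in range(max_iters):
        if train_err(w) <= E_min + margin:
            break
        grad = 2.0 / n * X.T @ (X @ w - y)
        w -= lr * grad
    return w

w_early = early_stop(margin=0.1)   # stop early, above the minimum
w_full = early_stop(margin=1e-10)  # train (almost) to the minimum
print(train_err(w_early), train_err(w_full), E_min)
```

The paper's result concerns the expected generalization error averaged over all models with the same training error, so comparing the two runs above on a test set says nothing by itself; the sketch only makes the stopping criterion concrete.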
Cite
Text
Cataltepe et al. "No Free Lunch for Early Stopping." Neural Computation, 1999. doi:10.1162/089976699300016557

Markdown

[Cataltepe et al. "No Free Lunch for Early Stopping." Neural Computation, 1999.](https://mlanthology.org/neco/1999/cataltepe1999neco-free/) doi:10.1162/089976699300016557

BibTeX
@article{cataltepe1999neco-free,
title = {{No Free Lunch for Early Stopping}},
author = {Cataltepe, Zehra and Abu-Mostafa, Yaser S. and Magdon-Ismail, Malik},
journal = {Neural Computation},
year = {1999},
pages = {995--1009},
doi = {10.1162/089976699300016557},
volume = {11},
url = {https://mlanthology.org/neco/1999/cataltepe1999neco-free/}
}