Neural Network Model Selection Using Asymptotic Jackknife Estimator and Cross-Validation Method

Abstract

Two theorems and a lemma are presented on the use of the jackknife estimator and the cross-validation method for model selection. Theorem 1 gives the asymptotic form of the jackknife estimator. Combined with the model selection criterion, this asymptotic form can be used to obtain the fit of a model. The model selection criterion used is the negative of the average predictive likelihood, a choice based on the idea of the cross-validation method. Lemma 1 provides a formula for further exploration of the asymptotics of the model selection criterion. Theorem 2 gives an asymptotic form of the model selection criterion for the regression case, when the parameter optimization criterion has a penalty term. Theorem 2 also proves the asymptotic equivalence of Moody's model selection criterion (Moody, 1992) and the cross-validation method, when the distance measure between the response y and the regression function takes the form of a squared difference.
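For readers unfamiliar with the criterion the abstract refers to, the sketch below computes a leave-one-out (cross-validation) estimate of the negative average predictive log-likelihood for a simple least-squares regression. This is an illustrative toy, not code from the paper: the linear model, the fixed unit noise variance, and the function name are assumptions made for the example.

```python
import numpy as np

def loo_neg_predictive_loglik(X, y):
    """Leave-one-out estimate of the average negative predictive
    log-likelihood for least-squares regression with a fixed unit
    noise variance (illustrative assumption)."""
    n = len(y)
    total = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        # Fit the model on all points except point i.
        beta, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        resid = y[i] - X[i] @ beta
        # Negative Gaussian log-likelihood of the held-out point,
        # with sigma^2 = 1.
        total += 0.5 * resid**2 + 0.5 * np.log(2 * np.pi)
    return total / n
```

Under this criterion, a lower value indicates a better-fitting model; the paper's Theorem 2 relates the asymptotics of such a criterion to Moody's penalized criterion in the squared-error case.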

Cite

Text

Liu. "Neural Network Model Selection Using Asymptotic Jackknife Estimator and Cross-Validation Method." Neural Information Processing Systems, 1992.

Markdown

[Liu. "Neural Network Model Selection Using Asymptotic Jackknife Estimator and Cross-Validation Method." Neural Information Processing Systems, 1992.](https://mlanthology.org/neurips/1992/liu1992neurips-neural/)

BibTeX

@inproceedings{liu1992neurips-neural,
  title     = {{Neural Network Model Selection Using Asymptotic Jackknife Estimator and Cross-Validation Method}},
  author    = {Liu, Yong},
  booktitle = {Neural Information Processing Systems},
  year      = {1992},
  pages     = {599-606},
  url       = {https://mlanthology.org/neurips/1992/liu1992neurips-neural/}
}