Early Stopping for Nonparametric Testing
Abstract
Early stopping of iterative algorithms is an algorithmic regularization method to avoid overfitting in estimation and classification. In this paper, we show that early stopping can also be applied to achieve minimax optimal testing in a general nonparametric setup. Specifically, a Wald-type test statistic is constructed from an iterated estimate produced by functional gradient descent algorithms in a reproducing kernel Hilbert space. A notable contribution is to establish a "sharp" stopping rule: when the number of iterations achieves an optimal order, testing optimality is achievable; otherwise, testing optimality becomes impossible. As a by-product, a similar sharpness result is also derived for minimax optimal estimation under early stopping. All obtained results hold for various kernel classes, including Sobolev smoothness classes and Gaussian kernel classes.
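To make the mechanism concrete, below is a minimal Python sketch of early-stopped functional gradient descent in an RKHS, followed by an illustrative Wald-type statistic. The Gaussian kernel, step size, stopping time, and normalization are assumptions chosen for illustration only; in particular, the paper's sharp stopping rule and exact test calibration are not implemented here.

```python
import numpy as np

def gaussian_kernel(X1, X2, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between rows of X1 and X2."""
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-sq / (2 * bandwidth**2))

def early_stopped_fit(K, y, step=0.5, n_iters=50):
    """Functional gradient descent for squared loss in the RKHS.

    The iterate is f_t(x) = sum_i alpha_i K(x, x_i); each step moves
    alpha along the (negative) functional gradient. Stopping after
    n_iters iterations is the only regularization.
    """
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(n_iters):
        residual = y - K @ alpha       # residuals of the current iterate
        alpha += step * residual / n   # functional gradient step
    return alpha

def wald_type_statistic(K, alpha):
    """Illustrative Wald-type statistic: squared empirical norm of f_T."""
    f_hat = K @ alpha
    return float(np.mean(f_hat**2))

# Toy usage: test H0: f = 0 against a smooth alternative.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(2 * np.pi * X[:, 0]) + rng.normal(scale=0.5, size=200)
K = gaussian_kernel(X, X)
alpha = early_stopped_fit(K, y, n_iters=30)  # stopping time chosen ad hoc here
print(wald_type_statistic(K, alpha))         # large values reject H0
```

The role of the stopping time is visible in the sketch: too few iterations leave the signal underfit (low power), while too many iterations fit noise and inflate the statistic's variance, which is the trade-off the paper's optimal-order stopping rule resolves.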
Cite
Text
Liu and Cheng. "Early Stopping for Nonparametric Testing." Neural Information Processing Systems, 2018.

Markdown
[Liu and Cheng. "Early Stopping for Nonparametric Testing." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/liu2018neurips-early/)

BibTeX
@inproceedings{liu2018neurips-early,
title = {{Early Stopping for Nonparametric Testing}},
author = {Liu, Meimei and Cheng, Guang},
booktitle = {Neural Information Processing Systems},
year = {2018},
pages = {3985--3994},
url = {https://mlanthology.org/neurips/2018/liu2018neurips-early/}
}