Minimum Discrepancy Principle Strategy for Choosing K in k-NN Regression

Abstract

We present a novel data-driven strategy for choosing the hyperparameter k in the k-NN regression estimator without using any hold-out data. We treat the choice of the hyperparameter as an iterative procedure (over k) and propose an easily implemented strategy based on the idea of early stopping and the minimum discrepancy principle. This model selection strategy is proven to be minimax-optimal, under a fixed-design assumption on the covariates, over some smoothness function classes, for instance, the class of Lipschitz functions on a bounded domain. The novel method often improves statistical performance on artificial and real-world data sets in comparison to other model selection strategies, such as the hold-out method, 5-fold cross-validation, and the AIC criterion. The novelty of the strategy lies in reducing the computational cost of the model selection procedure while preserving the statistical (minimax) optimality of the resulting estimator. More precisely, given a sample of size n, if one should choose k among {1, …, n}, and {f^1, …, f^n} are the corresponding estimators of the regression function, the minimum discrepancy principle requires computing only a fraction of these estimators, which is not the case for generalized cross-validation, Akaike's AIC criterion, or the Lepskii principle.
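The idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: it assumes a one-dimensional fixed design, a known noise level sigma, and uses a simple stopping rule of the discrepancy-principle type, iterating from the smoothest estimator (k = n) toward k = 1 and stopping the first time the empirical residual drops to the noise level, so only a fraction of the n candidate estimators is ever computed. The function names `knn_fit` and `mdp_choose_k` are hypothetical.

```python
import numpy as np

def knn_fit(X, y, k):
    """Fixed-design k-NN regression: each fitted value is the
    average of the responses at the k nearest design points."""
    n = len(X)
    preds = np.empty(n)
    for i in range(n):
        idx = np.argsort(np.abs(X - X[i]))[:k]  # k nearest neighbors of X[i]
        preds[i] = y[idx].mean()
    return preds

def mdp_choose_k(X, y, sigma):
    """Discrepancy-principle early stopping over k (sketch).

    Decreasing k increases the estimator's complexity, so the
    empirical residual ||y - f^k||^2 / n decreases along the path.
    We stop at the first k whose residual falls to sigma^2, the
    assumed noise variance; no smaller k needs to be fitted.
    """
    n = len(y)
    for k in range(n, 0, -1):
        resid = np.mean((y - knn_fit(X, y, k)) ** 2)
        if resid <= sigma ** 2:  # discrepancy has reached the noise level
            return k
    return 1  # fallback: k = 1 interpolates, residual is 0
```

Because the loop exits at the first crossing, the estimators f^{k-1}, …, f^1 beyond the stopping index are never computed, which is the computational saving over rules such as generalized cross-validation or AIC that score all n candidates.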

Cite

Text

Averyanov and Celisse. "Minimum Discrepancy Principle Strategy for Choosing K in k-NN Regression." Machine Learning, 2025. doi:10.1007/S10994-024-06645-5

Markdown

[Averyanov and Celisse. "Minimum Discrepancy Principle Strategy for Choosing K in k-NN Regression." Machine Learning, 2025.](https://mlanthology.org/mlj/2025/averyanov2025mlj-minimum/) doi:10.1007/S10994-024-06645-5

BibTeX

@article{averyanov2025mlj-minimum,
  title     = {{Minimum Discrepancy Principle Strategy for Choosing K in k-NN Regression}},
  author    = {Averyanov, Yaroslav and Celisse, Alain},
  journal   = {Machine Learning},
  year      = {2025},
  pages     = {118},
  doi       = {10.1007/S10994-024-06645-5},
  volume    = {114},
  url       = {https://mlanthology.org/mlj/2025/averyanov2025mlj-minimum/}
}