Fast Computation of Leave-One-Out Cross-Validation for $k$-NN Regression
Abstract
We describe a fast computation method for leave-one-out cross-validation (LOOCV) for $k$-nearest neighbours ($k$-NN) regression. We show that, under a tie-breaking condition for nearest neighbours, the LOOCV estimate of the mean square error for $k$-NN regression is identical to the mean square error of $(k+1)$-NN regression evaluated on the training data, multiplied by the scaling factor $(k+1)^2/k^2$. Therefore, to compute the LOOCV score, one only needs to fit $(k+1)$-NN regression once, and does not need to repeat the training-validation of $k$-NN regression as many times as there are training data points. Numerical experiments confirm the validity of the fast computation method.
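The identity is easy to check numerically. Below is a minimal sketch (not code from the paper) using scikit-learn's `KNeighborsRegressor` on synthetic data; it assumes no distance ties among the covariates, so the tie-breaking condition holds almost surely, and compares the rescaled in-sample MSE of $(k+1)$-NN against a brute-force LOOCV of $k$-NN.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
n, k = 200, 5
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

# Fast LOOCV: fit (k+1)-NN once on the full training set.
# Each training point is its own nearest neighbour, so the (k+1)-NN
# prediction at x_i averages y_i with the k nearest other responses.
knn_kp1 = KNeighborsRegressor(n_neighbors=k + 1).fit(X, y)
mse_train = np.mean((y - knn_kp1.predict(X)) ** 2)
loocv_fast = (k + 1) ** 2 / k ** 2 * mse_train

# Brute-force LOOCV: refit k-NN n times, leaving one point out each time.
errs = []
for i in range(n):
    mask = np.arange(n) != i
    knn_k = KNeighborsRegressor(n_neighbors=k).fit(X[mask], y[mask])
    errs.append((y[i] - knn_k.predict(X[i:i + 1])[0]) ** 2)
loocv_brute = np.mean(errs)

print(loocv_fast, loocv_brute)  # should coincide when no distance ties occur
```

The rescaling factor comes from a one-line calculation: writing $S$ for the sum of the responses of the $k$ nearest neighbours of $x_i$ excluding $x_i$ itself, the in-sample $(k+1)$-NN residual is $y_i - (y_i + S)/(k+1) = \frac{k}{k+1}\left(y_i - S/k\right)$, i.e. $\frac{k}{k+1}$ times the LOOCV residual of $k$-NN, so squaring and averaging yields the factor $(k+1)^2/k^2$.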
Cite
Text
Kanagawa. "Fast Computation of Leave-One-Out Cross-Validation for $k$-NN Regression." Transactions on Machine Learning Research, 2024.Markdown
[Kanagawa. "Fast Computation of Leave-One-Out Cross-Validation for $k$-NN Regression." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/kanagawa2024tmlr-fast/)BibTeX
@article{kanagawa2024tmlr-fast,
title = {{Fast Computation of Leave-One-Out Cross-Validation for $k$-NN Regression}},
author = {Kanagawa, Motonobu},
journal = {Transactions on Machine Learning Research},
year = {2024},
url = {https://mlanthology.org/tmlr/2024/kanagawa2024tmlr-fast/}
}