Selective Nonparametric Regression via Testing

Abstract

Prediction with the possibility of abstention (or selective prediction) is an important problem for error-critical machine learning applications. While well studied in the classification setting, selective approaches to regression are much less developed. In this work, we consider the nonparametric heteroskedastic regression problem and develop an abstention procedure based on testing a hypothesis about the value of the conditional variance at a given point. Unlike existing methods, the proposed approach accounts not only for the value of the variance itself but also for the uncertainty of the corresponding variance predictor. We prove non-asymptotic bounds on the risk of the resulting estimator and show the existence of several different convergence regimes. The theoretical analysis is illustrated with a series of experiments on simulated and real-world data.
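To make the idea concrete, below is a minimal sketch (not the authors' exact procedure) of selective regression that abstains when the estimated conditional variance at a point is too large. The k-NN estimators, the plug-in threshold rule, and the names `k` and `variance_threshold` are illustrative assumptions; the paper instead builds the abstention rule from a hypothesis test on the variance that also accounts for the variance predictor's own uncertainty.

```python
# Minimal sketch of selective (abstaining) nonparametric regression.
# Assumptions: k-NN mean/variance estimators and a simple plug-in threshold,
# used here only to illustrate the abstention idea from the abstract.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def fit_selective_regressor(X, y, k=10):
    """Fit a k-NN mean estimator and a k-NN conditional-variance estimator."""
    mean_est = KNeighborsRegressor(n_neighbors=k).fit(X, y)
    residuals_sq = (y - mean_est.predict(X)) ** 2
    var_est = KNeighborsRegressor(n_neighbors=k).fit(X, residuals_sq)
    return mean_est, var_est

def predict_or_abstain(mean_est, var_est, X_new, variance_threshold=1.0):
    """Predict at X_new, returning NaN where the estimated variance is too high."""
    preds = mean_est.predict(X_new).astype(float)
    variances = np.clip(var_est.predict(X_new), 0.0, None)
    abstain = variances > variance_threshold  # plug-in rule; the paper replaces
    preds[abstain] = np.nan                   # this with a hypothesis test
    return preds, abstain
```

A plug-in threshold like this ignores how uncertain the variance estimate itself is; the paper's testing-based rule is designed precisely to account for that additional uncertainty.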

Cite

Text

Noskov et al. "Selective Nonparametric Regression via Testing." Proceedings of the 15th Asian Conference on Machine Learning, 2023.

Markdown

[Noskov et al. "Selective Nonparametric Regression via Testing." Proceedings of the 15th Asian Conference on Machine Learning, 2023.](https://mlanthology.org/acml/2023/noskov2023acml-selective/)

BibTeX

@inproceedings{noskov2023acml-selective,
  title     = {{Selective Nonparametric Regression via Testing}},
  author    = {Noskov, Fedor and Fishkov, Alexander and Panov, Maxim},
  booktitle = {Proceedings of the 15th Asian Conference on Machine Learning},
  year      = {2023},
  pages     = {1023--1038},
  volume    = {222},
  url       = {https://mlanthology.org/acml/2023/noskov2023acml-selective/}
}