Competing Against the Best Nearest Neighbor Filter in Regression

Abstract

Designing statistical procedures that are provably almost as accurate as the best one in a given family is one of the central topics in statistics and learning theory. Oracle inequalities then offer a convenient theoretical framework for evaluating different strategies, which can be roughly classified into two classes: selection strategies and aggregation strategies. The ultimate goal is to design strategies satisfying oracle inequalities with leading constant one and a rate-optimal residual term. Many recent papers address this problem in the case where the aim is to beat the best procedure from a given family of linear smoothers. However, the theory developed so far either does not cover the important case of nearest-neighbor smoothers or provides a suboptimal oracle inequality with a leading constant considerably larger than one. In this paper, we prove a new oracle inequality with leading constant one that is valid under a general assumption on linear smoothers, allowing one, for instance, to compete against the best nearest-neighbor filters.
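To make the connection between nearest-neighbor filters and linear smoothers concrete, here is a minimal, generic sketch (not code from the paper): a k-nearest-neighbor regression filter can be written as a row-stochastic matrix A applied to the response vector y, so the fitted values A @ y are linear in y. The function name and the toy data below are illustrative assumptions.

```python
import numpy as np

def knn_smoother_matrix(x, k):
    """Build the n x n matrix A of the k-nearest-neighbor filter:
    row i puts weight 1/k on the k design points closest to x[i],
    so the fitted values are simply A @ y (a linear smoother)."""
    n = len(x)
    A = np.zeros((n, n))
    for i in range(n):
        # indices of the k design points nearest to x[i] (including itself)
        nn = np.argsort(np.abs(x - x[i]))[:k]
        A[i, nn] = 1.0 / k
    return A

# toy 1D regression data (illustrative only)
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 50))
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=50)

A = knn_smoother_matrix(x, 5)
fitted = A @ y  # the k-NN filter is linear in y
```

Each row of A sums to one, i.e. A is an averaging (row-stochastic) matrix; oracle inequalities of the kind discussed above compare the risk of a data-driven choice among such matrices (e.g. over k) with that of the best one in the family.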

Cite

Text

Dalalyan and Salmon. "Competing Against the Best Nearest Neighbor Filter in Regression." International Conference on Algorithmic Learning Theory, 2011. doi:10.1007/978-3-642-24412-4_13

Markdown

[Dalalyan and Salmon. "Competing Against the Best Nearest Neighbor Filter in Regression." International Conference on Algorithmic Learning Theory, 2011.](https://mlanthology.org/alt/2011/dalalyan2011alt-competing/) doi:10.1007/978-3-642-24412-4_13

BibTeX

@inproceedings{dalalyan2011alt-competing,
  title     = {{Competing Against the Best Nearest Neighbor Filter in Regression}},
  author    = {Dalalyan, Arnak S. and Salmon, Joseph},
  booktitle = {International Conference on Algorithmic Learning Theory},
  year      = {2011},
  pages     = {129--143},
  doi       = {10.1007/978-3-642-24412-4_13},
  url       = {https://mlanthology.org/alt/2011/dalalyan2011alt-competing/}
}