Defying the Gravity of Learning Curve: A Characteristic of Nearest Neighbour Anomaly Detectors
Abstract
Conventional wisdom in machine learning holds that all algorithms follow the trajectory of a learning curve, often colloquially summarised as 'the more data the better'. We call this 'the gravity of learning curve', and it is commonly assumed that no learning algorithm is 'gravity-defiant'. Contrary to this conventional wisdom, this paper provides theoretical analysis and empirical evidence that nearest neighbour anomaly detectors are gravity-defiant algorithms.
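As a rough illustration of the kind of detector the paper studies (not the authors' exact algorithm or experimental setup), a nearest-neighbour anomaly detector scores each point by its distance to the nearest neighbour in a reference sample; the paper's counter-intuitive finding is that a small random subsample can serve as that reference as well as, or better than, the full data set. The sketch below is a minimal 1-NN distance scorer on synthetic 1-D Gaussian data; all names and the data-generating choices are illustrative assumptions.

```python
import random

def nn_score(x, sample):
    """Anomaly score of x: distance to its nearest neighbour in sample.
    Larger distance = more anomalous."""
    return min(abs(x - s) for s in sample)

random.seed(0)
# Inliers clustered around 0; one obvious anomaly far away.
data = [random.gauss(0.0, 1.0) for _ in range(1000)]
anomaly = 10.0

# Score against a small random subsample rather than the full data set;
# the gravity-defiant observation is that such small samples can suffice.
subsample = random.sample(data, 20)

print("inlier score :", nn_score(data[0], subsample))
print("anomaly score:", nn_score(anomaly, subsample))
```

With a well-spread subsample, the anomaly's nearest-neighbour distance stays large while inliers almost always have a close neighbour, so the ranking of anomalies is preserved even at small sample sizes.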
Cite
Text
Ting et al. "Defying the Gravity of Learning Curve: A Characteristic of Nearest Neighbour Anomaly Detectors." Machine Learning, 2017. doi:10.1007/s10994-016-5586-4
Markdown
[Ting et al. "Defying the Gravity of Learning Curve: A Characteristic of Nearest Neighbour Anomaly Detectors." Machine Learning, 2017.](https://mlanthology.org/mlj/2017/ting2017mlj-defying/) doi:10.1007/s10994-016-5586-4
BibTeX
@article{ting2017mlj-defying,
title = {{Defying the Gravity of Learning Curve: A Characteristic of Nearest Neighbour Anomaly Detectors}},
author = {Ting, Kai Ming and Washio, Takashi and Wells, Jonathan R. and Aryal, Sunil},
journal = {Machine Learning},
year = {2017},
pages = {55--91},
doi = {10.1007/s10994-016-5586-4},
volume = {106},
url = {https://mlanthology.org/mlj/2017/ting2017mlj-defying/}
}