Fast Time Series Classification Using Numerosity Reduction

Abstract

Many algorithms have been proposed for the problem of time series classification. However, it is clear that one-nearest-neighbor with Dynamic Time Warping (DTW) distance is exceptionally difficult to beat. This approach has one weakness, however: it is computationally too demanding for many real-time applications. One way to mitigate this problem is to speed up the DTW calculations. Nonetheless, there is a limit to how much this can help. In this work, we propose an additional technique, numerosity reduction, to speed up one-nearest-neighbor DTW. While the idea of numerosity reduction for nearest-neighbor classifiers has a long history, we show here that we can leverage an original observation about the relationship between dataset size and DTW constraints to produce an extremely compact dataset with little or no loss in accuracy. We test our ideas with a comprehensive set of experiments, and show that they can efficiently produce extremely fast, accurate classifiers.
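For readers unfamiliar with the baseline the paper builds on, the following is a minimal sketch of one-nearest-neighbor classification under DTW with a Sakoe-Chiba warping window (the "DTW constraint" the abstract refers to). This is an illustrative implementation, not the authors' optimized code; all function and parameter names are my own.

```python
import numpy as np

def dtw_distance(a, b, window=None):
    """DTW distance between two sequences, optionally constrained to a
    Sakoe-Chiba band of half-width `window` around the diagonal."""
    n, m = len(a), len(b)
    # An unconstrained computation is equivalent to a band covering everything.
    w = max(n, m) if window is None else max(window, abs(n - m))
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - w), min(m, i + w) + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            # Extend the cheapest of the three admissible warping steps.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return np.sqrt(D[n, m])

def nn_classify(query, train_series, train_labels, window=None):
    """Label `query` with the class of its nearest training series under DTW."""
    dists = [dtw_distance(query, s, window) for s in train_series]
    return train_labels[int(np.argmin(dists))]
```

Each classification scans the entire training set, which is why the paper's numerosity reduction (discarding redundant training instances) directly cuts classification time: the cost is linear in the number of stored series, and quadratic in series length per DTW call (less with a tight window).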

Cite

Text

Xi et al. "Fast Time Series Classification Using Numerosity Reduction." International Conference on Machine Learning, 2006. doi:10.1145/1143844.1143974

Markdown

[Xi et al. "Fast Time Series Classification Using Numerosity Reduction." International Conference on Machine Learning, 2006.](https://mlanthology.org/icml/2006/xi2006icml-fast/) doi:10.1145/1143844.1143974

BibTeX

@inproceedings{xi2006icml-fast,
  title     = {{Fast Time Series Classification Using Numerosity Reduction}},
  author    = {Xi, Xiaopeng and Keogh, Eamonn J. and Shelton, Christian R. and Wei, Li and Ratanamahatana, Chotirat Ann},
  booktitle = {International Conference on Machine Learning},
  year      = {2006},
  pages     = {1033--1040},
  doi       = {10.1145/1143844.1143974},
  url       = {https://mlanthology.org/icml/2006/xi2006icml-fast/}
}