One Class Splitting Criteria for Random Forests

Abstract

Random Forests (RFs) are strong machine learning tools for classification and regression. However, they remain supervised algorithms, and no extension of RFs to the one-class setting has been proposed, except for techniques based on second-class sampling. This work fills this gap by proposing a natural methodology to extend standard splitting criteria to the one-class setting, structurally generalizing RFs to one-class classification. An extensive benchmark of seven state-of-the-art anomaly detection algorithms is also presented. This empirically demonstrates the relevance of our approach.
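
To make the abstract's central idea concrete, here is a minimal, hedged sketch of how a standard two-class splitting criterion can be carried over to the one-class setting by treating the absent second class as an implicit uniform distribution over each node's cell. The notation below (the weight $\gamma$, the volume term $\operatorname{Leb}(\cdot)$, the surrogate proportion $\tilde{p}$) is illustrative and not necessarily the authors' exact formulation.

\[
i(t) \;=\; 2\,\tilde{p}(t)\bigl(1-\tilde{p}(t)\bigr),
\qquad
\tilde{p}(t) \;=\; \frac{n_t}{\,n_t + \gamma\,\operatorname{Leb}(\mathcal{X}_t)\,},
\]
\[
\Delta i(s,t) \;=\; i(t) \;-\; w_L\, i(t_L) \;-\; w_R\, i(t_R),
\qquad
w_{L/R} \;=\; \frac{n_{t_{L/R}} + \gamma\,\operatorname{Leb}\bigl(\mathcal{X}_{t_{L/R}}\bigr)}{n_t + \gamma\,\operatorname{Leb}(\mathcal{X}_t)},
\]

where $n_t$ is the number of (one-class) training points in node $t$ with cell $\mathcal{X}_t$, and a split $s$ produces children $t_L$, $t_R$. Replacing the empirical second-class counts of the usual Gini criterion by a mass proportional to cell volume keeps the greedy CART machinery intact without sampling artificial second-class points, which is, at a high level, the kind of structural generalization the abstract describes.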

Cite

Text

Goix et al. "One Class Splitting Criteria for Random Forests." Proceedings of the Ninth Asian Conference on Machine Learning, 2017.

Markdown

[Goix et al. "One Class Splitting Criteria for Random Forests." Proceedings of the Ninth Asian Conference on Machine Learning, 2017.](https://mlanthology.org/acml/2017/goix2017acml-one/)

BibTeX

@inproceedings{goix2017acml-one,
  title     = {{One Class Splitting Criteria for Random Forests}},
  author    = {Goix, Nicolas and Drougard, Nicolas and Brault, Romain and Chiapino, Mael},
  booktitle = {Proceedings of the Ninth Asian Conference on Machine Learning},
  year      = {2017},
  pages     = {343--358},
  volume    = {77},
  url       = {https://mlanthology.org/acml/2017/goix2017acml-one/}
}