Cautious Random Forests: A New Decision Strategy and Some Experiments

Abstract

Random forests are accurate classifiers that estimate the posterior probabilities of the classes by averaging the frequencies provided by their trees. When data are scarce, this estimation becomes difficult. The Imprecise Dirichlet Model can be used to make the estimation robust, yielding intervals of probabilities as outputs. Here, we propose a new aggregation strategy based on the theory of belief functions. We also propose to weight the trees according to their amount of uncertainty when classifying a new instance. Our approach is compared experimentally to the baseline approach on several datasets.
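The abstract mentions using the Imprecise Dirichlet Model (IDM) to robustify the probability estimates obtained from tree leaf frequencies. As a minimal sketch (not the authors' implementation), the snippet below shows how the standard IDM turns the class counts in a leaf into lower and upper probabilities, with `counts` and the hyperparameter `s` being assumed illustrative inputs.

```python
import numpy as np

def idm_intervals(counts, s=2.0):
    """Imprecise Dirichlet Model intervals from class counts.

    For class k with count n_k out of N observations, the IDM interval is
    [ n_k / (N + s) , (n_k + s) / (N + s) ], where s > 0 controls caution.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    lower = counts / (n + s)
    upper = (counts + s) / (n + s)
    return lower, upper

# Example: a leaf with 3 instances of class 0 and 1 instance of class 1.
low, up = idm_intervals([3, 1])
print(low)  # [0.5        0.16666667]
print(up)   # [0.83333333 0.5       ]
```

With few observations in a leaf, the intervals are wide, reflecting the lack of evidence; as counts grow, the lower and upper probabilities converge toward the empirical frequencies.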

Cite

Text

Zhang et al. "Cautious Random Forests: A New Decision Strategy and Some Experiments." Proceedings of the Twelfth International Symposium on Imprecise Probability: Theories and Applications, 2021.

Markdown

[Zhang et al. "Cautious Random Forests: A New Decision Strategy and Some Experiments." Proceedings of the Twelfth International Symposium on Imprecise Probability: Theories and Applications, 2021.](https://mlanthology.org/isipta/2021/zhang2021isipta-cautious/)

BibTeX

@inproceedings{zhang2021isipta-cautious,
  title     = {{Cautious Random Forests: A New Decision Strategy and Some Experiments}},
  author    = {Zhang, Haifei and Quost, Benjamin and Masson, Marie-Hélène},
  booktitle = {Proceedings of the Twelfth International Symposium on Imprecise Probability: Theories and Applications},
  year      = {2021},
  pages     = {369--372},
  volume    = {147},
  url       = {https://mlanthology.org/isipta/2021/zhang2021isipta-cautious/}
}