Deep Forest: Towards an Alternative to Deep Neural Networks

Abstract

In this paper, we propose gcForest, a decision tree ensemble approach whose performance is highly competitive with deep neural networks across a broad range of tasks. In contrast to deep neural networks, which require great effort in hyper-parameter tuning, gcForest is much easier to train; even when applied to different data across different domains in our experiments, excellent performance is achieved with almost the same hyper-parameter settings. The training process of gcForest is efficient, and users can control the training cost according to the computational resources available. Efficiency may be further enhanced because gcForest is naturally amenable to parallel implementation. Furthermore, in contrast to deep neural networks, which require large-scale training data, gcForest can work well even when only small-scale training data are available.
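The cascade structure behind gcForest can be sketched roughly as follows. This is a minimal illustration using scikit-learn, not the authors' implementation: multi-grained scanning is omitted, and for brevity each level's class vectors are produced by direct fitting rather than the k-fold cross-validation the paper uses; the `fit_cascade` helper and its parameters are our own naming.

```python
# Minimal cascade-forest sketch in the spirit of gcForest (illustrative only).
# Each level holds a few forests; their class-probability vectors are
# concatenated with the raw features to form the next level's input, and the
# cascade stops growing once validation accuracy no longer improves.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

def fit_cascade(X_tr, y_tr, X_va, y_va, max_levels=5, seed=0):
    levels, best_acc = [], 0.0
    A_tr, A_va = X_tr, X_va  # feature matrices, augmented level by level
    for _ in range(max_levels):
        forests = [
            RandomForestClassifier(n_estimators=100, random_state=seed),
            ExtraTreesClassifier(n_estimators=100, random_state=seed),
        ]
        P_tr, P_va = [], []
        for f in forests:
            f.fit(A_tr, y_tr)
            # NOTE: the paper generates these class vectors via k-fold CV
            # to reduce overfitting; we fit directly for brevity.
            P_tr.append(f.predict_proba(A_tr))
            P_va.append(f.predict_proba(A_va))
        levels.append(forests)
        # the level's prediction averages the forests' probability vectors
        acc = (np.mean(P_va, axis=0).argmax(axis=1) == y_va).mean()
        if acc <= best_acc:  # stop when validation accuracy stalls
            levels.pop()
            break
        best_acc = acc
        # concatenate class vectors with raw features for the next level
        A_tr = np.hstack([X_tr] + P_tr)
        A_va = np.hstack([X_va] + P_va)
    return levels, best_acc

X, y = load_digits(return_X_y=True)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)
cascade, acc = fit_cascade(X_tr, y_tr, X_va, y_va)
print(len(cascade), round(acc, 3))
```

Because the cascade depth is decided by this validation check rather than fixed in advance, training cost adapts to the data, which is one way the abstract's "users can control training cost" claim plays out.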

Cite

Text

Zhou and Feng. "Deep Forest: Towards an Alternative to Deep Neural Networks." International Joint Conference on Artificial Intelligence, 2017. doi:10.24963/IJCAI.2017/497

Markdown

[Zhou and Feng. "Deep Forest: Towards an Alternative to Deep Neural Networks." International Joint Conference on Artificial Intelligence, 2017.](https://mlanthology.org/ijcai/2017/zhou2017ijcai-deep/) doi:10.24963/IJCAI.2017/497

BibTeX

@inproceedings{zhou2017ijcai-deep,
  title     = {{Deep Forest: Towards an Alternative to Deep Neural Networks}},
  author    = {Zhou, Zhi-Hua and Feng, Ji},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2017},
  pages     = {3553-3559},
  doi       = {10.24963/IJCAI.2017/497},
  url       = {https://mlanthology.org/ijcai/2017/zhou2017ijcai-deep/}
}