PSForest: Improving Deep Forest via Feature Pooling and Error Screening

Abstract

In recent years, most deep learning research has been based on deep neural networks, which use the backpropagation algorithm to train the parameters of nonlinear layers. Recently, Zhou and Feng proposed a non-NN-style deep model called Deep Forest (gcForest), a deep learning model built on random forests whose training process does not rely on backpropagation. In this paper, we propose PSForest, which can be regarded as a modification of the standard Deep Forest. The main ideas for improving the efficiency and performance of Deep Forest are to apply multi-grained pooling to the raw features and to screen the class vectors of each layer based on out-of-bag error. Experiments on different datasets show that our proposed model achieves predictive accuracy comparable to or better than gcForest, with lower memory requirements and smaller time cost. This study significantly improves the competitiveness of deep forests, further demonstrating that deep learning is more than just deep neural networks.
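The two components named in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the average-pooling choice, the window sizes, the scikit-learn random forests, and the out-of-bag accuracy threshold `oob_threshold` are all assumptions made for the example.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

def multi_grained_pooling(X, window_sizes=(2, 4, 8)):
    """Pool raw feature vectors at several grain sizes (assumed average pooling)."""
    pooled = [X]
    for w in window_sizes:
        n_windows = X.shape[1] // w
        if n_windows == 0:
            continue
        trimmed = X[:, :n_windows * w].reshape(X.shape[0], n_windows, w)
        pooled.append(trimmed.mean(axis=2))  # average-pool each non-overlapping window
    return np.hstack(pooled)

def cascade_layer_with_screening(X, y, n_forests=2, oob_threshold=0.5):
    """Train one cascade layer; keep only class vectors from forests whose
    out-of-bag accuracy clears the (assumed) screening threshold."""
    kept_class_vectors, kept_forests = [], []
    for i in range(n_forests):
        rf = RandomForestClassifier(n_estimators=100, oob_score=True,
                                    bootstrap=True, random_state=i)
        rf.fit(X, y)
        if rf.oob_score_ >= oob_threshold:            # error screening via OOB score
            kept_class_vectors.append(rf.oob_decision_function_)
            kept_forests.append(rf)
    if kept_class_vectors:
        X = np.hstack([X] + kept_class_vectors)       # augment features for the next layer
    return X, kept_forests

In this sketch, pooled raw features would feed the first cascade layer, and each subsequent layer would be trained on the screened, augmented feature matrix returned by the previous one.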

Cite

Text

Ni and Kao. "PSForest: Improving Deep Forest via Feature Pooling and Error Screening." Proceedings of The 12th Asian Conference on Machine Learning, 2020.

Markdown

[Ni and Kao. "PSForest: Improving Deep Forest via Feature Pooling and Error Screening." Proceedings of The 12th Asian Conference on Machine Learning, 2020.](https://mlanthology.org/acml/2020/ni2020acml-psforest/)

BibTeX

@inproceedings{ni2020acml-psforest,
  title     = {{PSForest: Improving Deep Forest via Feature Pooling and Error Screening}},
  author    = {Ni, Shiwen and Kao, Hung-Yu},
  booktitle = {Proceedings of The 12th Asian Conference on Machine Learning},
  year      = {2020},
  pages     = {769--781},
  volume    = {129},
  url       = {https://mlanthology.org/acml/2020/ni2020acml-psforest/}
}