AugBoost: Gradient Boosting Enhanced with Step-Wise Feature Augmentation

Abstract

Gradient Boosted Decision Trees (GBDT) is a widely used machine learning algorithm that obtains state-of-the-art results on many machine learning tasks. In this paper we introduce a method for obtaining better results by augmenting the features in the dataset between the iterations of GBDT. We explore a number of augmentation methods: training an Artificial Neural Network (ANN) and extracting features from its last hidden layer (supervised), and rotating the feature space using unsupervised methods such as Principal Component Analysis (PCA) or Random Projection (RP). These variations on GBDT were tested on 20 classification tasks, on which all of them outperformed GBDT and previous related work.
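The core idea of interleaving feature augmentation with boosting stages can be sketched as follows. This is a minimal illustration of the RP variant only, not the authors' implementation: it uses NumPy, a hand-rolled regression stump as the base learner, squared loss, and a Gaussian random projection of the original features every few stages (the function names, interval `augment_every`, and all hyperparameters are illustrative assumptions).

```python
import numpy as np

def fit_stump(X, r):
    """Fit a one-split regression stump to residuals r (illustrative base learner)."""
    best = None
    for j in range(X.shape[1]):
        thr = np.median(X[:, j])
        left = X[:, j] <= thr
        if left.all() or (~left).all():
            continue  # degenerate split, skip this feature
        pred = np.where(left, r[left].mean(), r[~left].mean())
        err = ((r - pred) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, j, thr, r[left].mean(), r[~left].mean())
    _, j, thr, lv, rv = best
    return lambda Z: np.where(Z[:, j] <= thr, lv, rv)

def augboost_rp_sketch(X, y, n_stages=12, augment_every=3, lr=0.3, seed=0):
    """Boost stumps on residuals, re-augmenting features every few stages
    with a random projection of the original feature space (AugBoost-RP idea)."""
    rng = np.random.default_rng(seed)
    feats = X.copy()
    pred = np.full(len(y), y.mean())
    for t in range(n_stages):
        if t > 0 and t % augment_every == 0:
            # step-wise augmentation: append a fresh random rotation of X
            R = rng.normal(size=(X.shape[1], X.shape[1]))
            feats = np.hstack([X, X @ R])
        r = y - pred                    # negative gradient of squared loss
        h = fit_stump(feats, r)         # next stage sees the augmented features
        pred += lr * h(feats)
    return pred
```

A full implementation would store each stage's feature transform so it can be replayed at prediction time, and would swap the stump for a depth-limited tree; the sketch only shows how augmentation slots in between boosting iterations.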

Cite

Text

Tannor and Rokach. "AugBoost: Gradient Boosting Enhanced with Step-Wise Feature Augmentation." International Joint Conference on Artificial Intelligence, 2019. doi:10.24963/IJCAI.2019/493

Markdown

[Tannor and Rokach. "AugBoost: Gradient Boosting Enhanced with Step-Wise Feature Augmentation." International Joint Conference on Artificial Intelligence, 2019.](https://mlanthology.org/ijcai/2019/tannor2019ijcai-augboost/) doi:10.24963/IJCAI.2019/493

BibTeX

@inproceedings{tannor2019ijcai-augboost,
  title     = {{AugBoost: Gradient Boosting Enhanced with Step-Wise Feature Augmentation}},
  author    = {Tannor, Philip and Rokach, Lior},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2019},
  pages     = {3555--3561},
  doi       = {10.24963/IJCAI.2019/493},
  url       = {https://mlanthology.org/ijcai/2019/tannor2019ijcai-augboost/}
}