Feature Importance Measurement Based on Decision Tree Sampling

Abstract

Random forest is effective for prediction tasks, but the randomness of tree generation hinders interpretability in feature importance analysis. To address this, we propose a SAT-based method for measuring feature importance in tree-based models. Our method has fewer parameters than random forest and provides higher interpretability and stability for analyses of real-world problems.
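As a minimal illustration of the instability the abstract motivates (this sketch is not from the paper), feature importances from a standard random forest shift from run to run with the random seed; the dataset and hyperparameters below are illustrative assumptions:

```python
# Sketch: random forest feature importances depend on the random seed,
# the kind of run-to-run variability the abstract describes.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic data with 3 informative features out of 8 (illustrative choice)
X, y = make_classification(
    n_samples=200, n_features=8, n_informative=3, random_state=0
)

importances = []
for seed in range(5):
    rf = RandomForestClassifier(n_estimators=50, random_state=seed)
    rf.fit(X, y)
    importances.append(rf.feature_importances_)

importances = np.array(importances)
# Nonzero standard deviation across seeds reveals the instability
print(np.std(importances, axis=0))
```

Each row of `importances` sums to 1, yet the per-feature values differ across seeds, so rankings can change between otherwise identical runs.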

Cite

Text

Huang et al. "Feature Importance Measurement Based on Decision Tree Sampling." ICML 2023 Workshops: IMLH, 2023.

Markdown

[Huang et al. "Feature Importance Measurement Based on Decision Tree Sampling." ICML 2023 Workshops: IMLH, 2023.](https://mlanthology.org/icmlw/2023/huang2023icmlw-feature/)

BibTeX

@inproceedings{huang2023icmlw-feature,
  title     = {{Feature Importance Measurement Based on Decision Tree Sampling}},
  author    = {Huang, Chao and Das, Diptesh and Tsuda, Koji},
  booktitle = {ICML 2023 Workshops: IMLH},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/huang2023icmlw-feature/}
}