From Global to Local MDI Variable Importances for Random Forests and When They Are Shapley Values
Abstract
Random forests have been widely used for their ability to provide so-called importance measures, which give insight at a global (per-dataset) level into the relevance of input variables for predicting a given output. On the other hand, methods based on Shapley values have been introduced to refine the analysis of feature relevance in tree-based models to a local (per-instance) level. In this context, we first show that the global Mean Decrease of Impurity (MDI) variable importance scores correspond to Shapley values under some conditions. Then, we derive a local MDI importance measure of variable relevance, which has a very natural connection with the global MDI measure and can be related to a new notion of local feature relevance. We further link local MDI importances with Shapley values and discuss them in the light of related measures from the literature. The measures are illustrated through experiments on several classification and regression problems.
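As a rough illustration of the global notion discussed in the abstract, the sketch below fits a random forest with scikit-learn and reads off its impurity-based (MDI) feature importances. The library, dataset, and hyperparameters are illustrative assumptions rather than the paper's setup, and scikit-learn only exposes the global scores, not the paper's local per-instance MDI measure.

# Minimal sketch (assumes scikit-learn is installed): global, impurity-based
# (MDI) importances exposed by a fitted random forest.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

data = load_iris()
X, y = data.data, data.target

# Hyperparameters are illustrative only, not those studied in the paper.
forest = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)

# Global (per-dataset) MDI importance of each input variable.
for name, imp in zip(data.feature_names, forest.feature_importances_):
    print(f"{name}: {imp:.3f}")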
Cite
Sutera et al. "From Global to Local MDI Variable Importances for Random Forests and When They Are Shapley Values." Neural Information Processing Systems, 2021.
https://mlanthology.org/neurips/2021/sutera2021neurips-global/

BibTeX
@inproceedings{sutera2021neurips-global,
  title = {{From Global to Local MDI Variable Importances for Random Forests and When They Are Shapley Values}},
  author = {Sutera, Antonio and Louppe, Gilles and Huynh-Thu, Van Anh and Wehenkel, Louis and Geurts, Pierre},
  booktitle = {Neural Information Processing Systems},
  year = {2021},
  url = {https://mlanthology.org/neurips/2021/sutera2021neurips-global/}
}