Computing Abductive Explanations for Boosted Regression Trees

Abstract

We present two algorithms for generating (resp. evaluating) abductive explanations for boosted regression trees. Given an instance x and an interval I containing its value F(x) for the boosted regression tree F at hand, the generation algorithm returns a (most general) term t over the Boolean conditions in F such that every instance x′ satisfying t is such that F(x′) ∈ I. The evaluation algorithm tackles the corresponding inverse problem: given F, x, and a term t over the Boolean conditions in F such that t covers x, find the least interval I_t such that for every instance x′ covered by t we have F(x′) ∈ I_t. Experiments on various datasets show that the two algorithms are practical enough to be used for generating (resp. evaluating) abductive explanations for boosted regression trees based on a large number of Boolean conditions.
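To make the evaluation problem concrete, below is a minimal Python sketch, not the authors' algorithm. It bounds the output of a toy boosted regression tree over all instances covered by a term t by summing per-tree leaf bounds. The Node representation, the encoding of t as fixed truth values of conditions "x[feature] < threshold", and the function names are hypothetical illustrations.

from dataclasses import dataclass
from typing import Dict, Optional, Tuple

# Hypothetical tree encoding: an internal node tests "x[feature] < threshold";
# a leaf has feature = None and carries a numeric value.
@dataclass
class Node:
    feature: Optional[int] = None
    threshold: Optional[float] = None
    left: Optional["Node"] = None    # branch taken when the condition holds
    right: Optional["Node"] = None   # branch taken when it does not
    value: float = 0.0               # leaf value (only used when feature is None)

# A term t fixes the truth values of some Boolean conditions occurring in the trees.
Term = Dict[Tuple[int, float], bool]

def leaf_range(node: Node, term: Term) -> Tuple[float, float]:
    """Min and max leaf values of one tree reachable by instances covered by the term."""
    if node.feature is None:
        return node.value, node.value
    fixed = term.get((node.feature, node.threshold))
    if fixed is True:
        return leaf_range(node.left, term)
    if fixed is False:
        return leaf_range(node.right, term)
    lo_l, hi_l = leaf_range(node.left, term)
    lo_r, hi_r = leaf_range(node.right, term)
    return min(lo_l, lo_r), max(hi_l, hi_r)

def interval_for_term(forest, term: Term) -> Tuple[float, float]:
    """Enclosing interval for the ensemble value, obtained by summing per-tree bounds."""
    lo = hi = 0.0
    for tree in forest:
        t_lo, t_hi = leaf_range(tree, term)
        lo += t_lo
        hi += t_hi
    return lo, hi

# Toy usage: two stumps; the term fixes only "x[0] < 5.0" to true.
tree1 = Node(feature=0, threshold=5.0, left=Node(value=1.0), right=Node(value=3.0))
tree2 = Node(feature=1, threshold=2.0, left=Node(value=0.5), right=Node(value=2.5))
print(interval_for_term([tree1, tree2], {(0, 5.0): True}))  # (1.5, 3.5)

Note that summing per-tree extremes only yields an enclosing interval: the leaves achieving the per-tree minima (or maxima) may require incompatible feature values, so the result can be a strict superset of the least interval I_t that the paper's evaluation algorithm targets.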

Cite

Text

Audemard et al. "Computing Abductive Explanations for Boosted Regression Trees." International Joint Conference on Artificial Intelligence, 2023. doi:10.24963/IJCAI.2023/382

Markdown

[Audemard et al. "Computing Abductive Explanations for Boosted Regression Trees." International Joint Conference on Artificial Intelligence, 2023.](https://mlanthology.org/ijcai/2023/audemard2023ijcai-computing/) doi:10.24963/IJCAI.2023/382

BibTeX

@inproceedings{audemard2023ijcai-computing,
  title     = {{Computing Abductive Explanations for Boosted Regression Trees}},
  author    = {Audemard, Gilles and Bellart, Steve and Lagniez, Jean-Marie and Marquis, Pierre},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2023},
  pages     = {3432--3441},
  doi       = {10.24963/IJCAI.2023/382},
  url       = {https://mlanthology.org/ijcai/2023/audemard2023ijcai-computing/}
}