Explaining Local, Global, and Higher-Order Interactions in Deep Learning
Abstract
We present a simple yet highly generalizable method for explaining interacting parts within a neural network's reasoning process. First, we design an algorithm based on cross derivatives for computing statistical interaction effects between individual features, which is generalized to both 2-way and higher-order (3-way or more) interactions. We present results side by side with a weight-based attribution technique, corroborating that cross derivatives are a superior metric for both 2-way and higher-order interaction detection. Moreover, we extend the use of cross derivatives as an explanatory device in neural networks to the computer vision setting by expanding Grad-CAM, a popular gradient-based explanatory tool for CNNs, to the higher order. While Grad-CAM can only explain the importance of individual objects in images, our method, which we call Taylor-CAM, can explain a neural network's relational reasoning across multiple objects. We show the success of our explanations both qualitatively and quantitatively, including with a user study. We will release all code as a tool package to facilitate explainable deep learning.
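The cross-derivative idea in the abstract can be illustrated with a short autograd sketch: a pairwise (2-way) interaction between two input features corresponds to a non-zero mixed partial derivative of the network output with respect to those features. The following is a minimal sketch under that reading, not the authors' released code; the names `cross_derivative`, `net`, and the toy MLP are illustrative assumptions, and it assumes a scalar-valued model over a flat feature vector.

```python
import torch
import torch.nn as nn

def cross_derivative(f, x, i, j):
    """Mixed partial d^2 f(x) / dx_i dx_j for a scalar-valued f and a flat input x."""
    x = x.clone().requires_grad_(True)
    y = f(x)  # scalar output assumed
    # First-order gradient, keeping the graph so it can be differentiated again.
    (grad_x,) = torch.autograd.grad(y, x, create_graph=True)
    # Differentiate the i-th gradient component w.r.t. x and read off entry j.
    (grad2_x,) = torch.autograd.grad(grad_x[i], x)
    return grad2_x[j]

# Toy usage: a small MLP over 3 features; a large |cross derivative| between
# features 0 and 1 would indicate a 2-way interaction at this input.
net = nn.Sequential(nn.Linear(3, 16), nn.Tanh(), nn.Linear(16, 1))
x = torch.randn(3)
print(cross_derivative(lambda z: net(z).squeeze(), x, 0, 1))
```

Higher-order (3-way or more) interactions extend this by differentiating repeatedly, one feature index at a time, in the same fashion.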
Cite
Text
Lerman et al. "Explaining Local, Global, and Higher-Order Interactions in Deep Learning." International Conference on Computer Vision, 2021. doi:10.1109/ICCV48922.2021.00126
Markdown
[Lerman et al. "Explaining Local, Global, and Higher-Order Interactions in Deep Learning." International Conference on Computer Vision, 2021.](https://mlanthology.org/iccv/2021/lerman2021iccv-explaining/) doi:10.1109/ICCV48922.2021.00126
BibTeX
@inproceedings{lerman2021iccv-explaining,
  title     = {{Explaining Local, Global, and Higher-Order Interactions in Deep Learning}},
  author    = {Lerman, Samuel and Venuto, Charles and Kautz, Henry and Xu, Chenliang},
  booktitle = {International Conference on Computer Vision},
  year      = {2021},
  pages     = {1224--1233},
  doi       = {10.1109/ICCV48922.2021.00126},
  url       = {https://mlanthology.org/iccv/2021/lerman2021iccv-explaining/}
}