Towards Explaining Image-Based Distribution Shifts
Abstract
Distribution shift can have fundamental consequences, such as signaling a change in the operating environment or significantly reducing the accuracy of downstream models. Understanding such shifts is therefore critical for examining and, ideally, mitigating their effects. Most prior work has focused on either natively handling distribution shift (e.g., domain generalization) or merely detecting a shift, assuming that any detected shift can be understood and handled appropriately by a human operator. For the latter setting, we aim to aid these manual mitigation tasks by explaining the distribution shift to the operator. To this end, we suggest two methods: providing a set of interpretable mappings from the original distribution to the shifted one, or providing a set of distributional counterfactual examples. We present preliminary experiments for both methods and discuss important concepts and challenges for moving towards a better understanding of image-based distribution shifts.
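To make the first idea concrete, the sketch below is a minimal toy illustration, not the paper's actual algorithm: it fits the simplest possible interpretable mapping, a per-feature translation T(x) = x + δ, between an original distribution and a synthetic mean-shifted one, and applies the fitted map to a source sample to produce a crude distributional counterfactual. The data, dimensions, and shift here are all hypothetical.

```python
import numpy as np

# Toy setup (hypothetical, not from the paper): samples from an original
# distribution P and a shifted distribution Q that differs by a mean shift.
rng = np.random.default_rng(0)
X_src = rng.normal(loc=0.0, scale=1.0, size=(500, 5))               # samples ~ P
true_shift = np.array([2.0, 0.0, 0.0, -1.0, 0.0])                   # hypothetical shift
X_tgt = rng.normal(loc=0.0, scale=1.0, size=(500, 5)) + true_shift  # samples ~ Q

# The simplest interpretable mapping: a translation T(x) = x + delta,
# where delta is estimated as the difference of empirical means.
delta = X_tgt.mean(axis=0) - X_src.mean(axis=0)
print("estimated per-feature shift:", np.round(delta, 2))

# A distributional counterfactual for a source sample x: T(x) shows what
# x would plausibly look like under the shifted distribution Q.
x = X_src[0]
print("counterfactual example:", np.round(x + delta, 2))
```

Each coordinate of δ directly reads off how that feature moved under the shift, which is what makes such a map interpretable to an operator; image-based shifts would, of course, require richer map families than a translation.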
Cite
Text
Kulinski and Inouye. "Towards Explaining Image-Based Distribution Shifts." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2022. doi:10.1109/CVPRW56347.2022.00525

Markdown
[Kulinski and Inouye. "Towards Explaining Image-Based Distribution Shifts." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2022.](https://mlanthology.org/cvprw/2022/kulinski2022cvprw-explaining/) doi:10.1109/CVPRW56347.2022.00525

BibTeX
@inproceedings{kulinski2022cvprw-explaining,
title = {{Towards Explaining Image-Based Distribution Shifts}},
author = {Kulinski, Sean and Inouye, David I.},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2022},
pages = {4787--4791},
doi = {10.1109/CVPRW56347.2022.00525},
url = {https://mlanthology.org/cvprw/2022/kulinski2022cvprw-explaining/}
}