A Sim2Real Approach for Identifying Task-Relevant Properties in Interpretable Machine Learning
Abstract
Existing user studies suggest that different tasks may require explanations with different properties. However, user studies are expensive. In this paper, we introduce XAIsim2real, a generalizable, cost-effective method for identifying task-relevant explanation properties in silico, which can guide the design of more expensive user studies. We use XAIsim2real to identify relevant proxies for three example tasks and validate our simulation with real user studies.
Cite
Text
Nofshin et al. "A Sim2Real Approach for Identifying Task-Relevant Properties in Interpretable Machine Learning." ICML 2024 Workshops: NextGenAISafety, 2024.
Markdown
[Nofshin et al. "A Sim2Real Approach for Identifying Task-Relevant Properties in Interpretable Machine Learning." ICML 2024 Workshops: NextGenAISafety, 2024.](https://mlanthology.org/icmlw/2024/nofshin2024icmlw-sim2real/)
BibTeX
@inproceedings{nofshin2024icmlw-sim2real,
title = {{A Sim2Real Approach for Identifying Task-Relevant Properties in Interpretable Machine Learning}},
author = {Nofshin, Eura and Brown, Esther and Lim, Brian and Pan, Weiwei and Doshi-Velez, Finale},
booktitle = {ICML 2024 Workshops: NextGenAISafety},
year = {2024},
url = {https://mlanthology.org/icmlw/2024/nofshin2024icmlw-sim2real/}
}