AI for Explaining Decisions in Multi-Agent Environments
Abstract
Explanation is necessary for humans to understand and accept decisions made by an AI system when the system's goal is known. It is even more important when the AI system makes decisions in multi-agent environments, where the human does not know the system's goals since they may depend on other agents' preferences. In such situations, explanations should aim to increase user satisfaction, taking into account the system's decision, the user's and the other agents' preferences, the environment settings, and properties such as fairness, envy, and privacy. Generating explanations that will increase user satisfaction is very challenging; to address this, we propose a new research direction: xMASE. We then review the state of the art and discuss research directions towards efficient methodologies and algorithms for generating explanations that will increase users' satisfaction with AI systems' decisions in multi-agent environments.
Cite
Text
Kraus et al. "AI for Explaining Decisions in Multi-Agent Environments." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I09.7077

Markdown

[Kraus et al. "AI for Explaining Decisions in Multi-Agent Environments." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/kraus2020aaai-ai/) doi:10.1609/AAAI.V34I09.7077

BibTeX
@inproceedings{kraus2020aaai-ai,
title = {{AI for Explaining Decisions in Multi-Agent Environments}},
author = {Kraus, Sarit and Azaria, Amos and Fiosina, Jelena and Greve, Maike and Hazon, Noam and Kolbe, Lutz M. and Lembcke, Tim-Benjamin and Müller, Jörg P. and Schleibaum, Sören and Vollrath, Mark},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2020},
  pages = {13534--13538},
doi = {10.1609/AAAI.V34I09.7077},
url = {https://mlanthology.org/aaai/2020/kraus2020aaai-ai/}
}