No Pizza for You: Value-Based Plan Selection in BDI Agents

Abstract

Autonomous agents are increasingly required to make moral decisions. In these situations, the agent should be able to reason about the ethical basis of a decision and explain the decision in terms of the moral values involved. This is especially important when the agent interacts with a user and must understand the user's value priorities in order to provide adequate support. This paper presents a model of agent behavior that takes into account user preferences and moral values.

Cite

Text

Cranefield et al. "No Pizza for You: Value-Based Plan Selection in BDI Agents." International Joint Conference on Artificial Intelligence, 2017. doi:10.24963/IJCAI.2017/26

Markdown

[Cranefield et al. "No Pizza for You: Value-Based Plan Selection in BDI Agents." International Joint Conference on Artificial Intelligence, 2017.](https://mlanthology.org/ijcai/2017/cranefield2017ijcai-pizza/) doi:10.24963/IJCAI.2017/26

BibTeX

@inproceedings{cranefield2017ijcai-pizza,
  title     = {{No Pizza for You: Value-Based Plan Selection in BDI Agents}},
  author    = {Cranefield, Stephen and Winikoff, Michael and Dignum, Virginia and Dignum, Frank},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2017},
  pages     = {178--184},
  doi       = {10.24963/IJCAI.2017/26},
  url       = {https://mlanthology.org/ijcai/2017/cranefield2017ijcai-pizza/}
}