Neural Agents Struggle to Take Turns in Bidirectional Emergent Communication

Abstract

The spontaneous exchange of turns is a central aspect of human communication. Although turn-taking conventions come to us naturally, artificial dialogue agents struggle to coordinate and must rely on hard-coded rules to engage in interactive conversations with human interlocutors. In this paper, we investigate the conditions under which artificial agents may naturally develop turn-taking conventions in a simple language game. We describe a cooperative task where success is contingent on the exchange of information along a shared communication channel in which talking over each other hinders communication. Despite these environmental constraints, neural-network-based agents trained to solve this task with reinforcement learning do not systematically adopt turn-taking conventions. However, we find that agents that do agree on turn-taking protocols end up performing better. Moreover, agents that are forced to take turns can learn to solve the task more quickly. This suggests that turn-taking may help to generate conversations that are easier for speakers to interpret.
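
To make the channel constraint mentioned in the abstract concrete, the sketch below implements a hypothetical shared channel in which two agents may each emit a symbol or stay silent on every step, and simultaneous messages collide so that neither is delivered. This is only an illustration of the "talking over each other hinders communication" idea, assuming a simple collision rule; the names, vocabulary, and mechanics are not taken from the paper's actual environment.

```python
# Minimal sketch (not the authors' implementation) of a shared channel
# where simultaneous speech causes a collision. The collision rule and
# all names here are illustrative assumptions based only on the abstract.
import random

SILENT = None  # an agent may stay silent instead of emitting a symbol


def channel_step(msg_a, msg_b):
    """Deliver each agent's message to its partner, unless both spoke.

    If both agents emit a symbol on the same step, the messages collide
    and neither is received, so talking over each other wastes the turn.
    """
    collision = msg_a is not SILENT and msg_b is not SILENT
    heard_by_a = None if collision else msg_b
    heard_by_b = None if collision else msg_a
    return heard_by_a, heard_by_b, collision


if __name__ == "__main__":
    vocab = ["0", "1", "2"]
    random.seed(0)
    for t in range(5):
        # Random speak-or-stay-silent choices stand in for trained agents.
        msg_a = random.choice(vocab + [SILENT])
        msg_b = random.choice(vocab + [SILENT])
        heard_a, heard_b, collision = channel_step(msg_a, msg_b)
        print(f"t={t}: A says {msg_a!r}, B says {msg_b!r}, "
              f"collision={collision}, A hears {heard_a!r}, B hears {heard_b!r}")
```

Under such a constraint, a turn-taking convention (one agent speaking while the other stays silent, then swapping roles) is one way to avoid collisions; the paper's finding is that reinforcement-learning agents do not systematically converge on it.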

Cite

Text

Taillandier et al. "Neural Agents Struggle to Take Turns in Bidirectional Emergent Communication." International Conference on Learning Representations, 2023.

Markdown

[Taillandier et al. "Neural Agents Struggle to Take Turns in Bidirectional Emergent Communication." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/taillandier2023iclr-neural/)

BibTeX

@inproceedings{taillandier2023iclr-neural,
  title     = {{Neural Agents Struggle to Take Turns in Bidirectional Emergent Communication}},
  author    = {Taillandier, Valentin and Hupkes, Dieuwke and Sagot, Benoît and Dupoux, Emmanuel and Michel, Paul},
  booktitle = {International Conference on Learning Representations},
  year      = {2023},
  url       = {https://mlanthology.org/iclr/2023/taillandier2023iclr-neural/}
}