Continual Adaptation for Efficient Machine Communication
Abstract
To communicate with new partners in new contexts, humans rapidly form new linguistic conventions. Recent language models trained with deep neural networks can comprehend and produce the conventions already present in their training data, but they cannot flexibly and interactively adapt those conventions on the fly as humans do. We introduce a repeated reference task as a benchmark for models of adaptation in communication, and propose a regularized continual learning framework that allows an artificial agent, initialized with a generic language model, to understand its partner more accurately and efficiently over time. We evaluate this framework through simulations on COCO and in real-time reference game experiments with human partners.
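To make the "regularized continual learning" idea concrete, here is a minimal sketch of partner-specific fine-tuning with a regularizer that penalizes drift from the generic model's initial weights. This is an illustrative L2-to-initialization penalty, not the authors' released code; the paper's exact regularizer, model architecture, and update schedule may differ, and `model`, `optimizer`, and `utterance_batches` are assumed to be supplied by the caller.

```python
# Hypothetical sketch of regularized continual adaptation for communication.
# Assumption: the regularizer is an L2 penalty toward the pre-adaptation
# (generic) weights; the paper's actual regularizer may differ.
import torch
import torch.nn.functional as F

def adapt_to_partner(model, optimizer, utterance_batches, reg_weight=1e-3):
    """Fine-tune `model` on data from the current partner while keeping
    each parameter close to its value in the generic language model."""
    # Snapshot the generic (pre-adaptation) parameters.
    init_params = {name: p.detach().clone()
                   for name, p in model.named_parameters()}
    for inputs, targets in utterance_batches:
        optimizer.zero_grad()
        logits = model(inputs)
        # Task loss: fit the utterances/referents observed in this interaction.
        nll = F.cross_entropy(logits, targets)
        # Continual-learning regularizer: penalize drift from the generic model.
        drift = sum(((p - init_params[name]) ** 2).sum()
                    for name, p in model.named_parameters())
        loss = nll + reg_weight * drift
        loss.backward()
        optimizer.step()
    return model
```

The design intuition is the usual continual-learning trade-off: the task term lets the agent pick up partner-specific conventions quickly, while the penalty term prevents catastrophic forgetting of the generic linguistic knowledge the agent was initialized with.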
Cite

Hawkins et al. "Continual Adaptation for Efficient Machine Communication." ICML 2019 Workshops: AMTL, 2019.

BibTeX
@inproceedings{hawkins2019icmlw-continual,
title = {{Continual Adaptation for Efficient Machine Communication}},
author = {Hawkins, Robert and Kwon, Minae and Sadigh, Dorsa and Goodman, Noah},
booktitle = {ICML 2019 Workshops: AMTL},
year = {2019},
url = {https://mlanthology.org/icmlw/2019/hawkins2019icmlw-continual/}
}