Language-Conditioned Goal Generation: A New Approach to Language Grounding in RL
Abstract
In the real world, linguistic agents are also embodied agents: they perceive and act in the physical world. The notion of language grounding questions the interactions between language and embodiment: how do learning agents connect, or ground, linguistic representations to the physical world? This question has recently been approached by the reinforcement learning community under the framework of instruction-following agents, in which behavioral policies or reward functions are conditioned on the embedding of an instruction expressed in natural language. This paper proposes another approach: using language to condition goal generators. Given any goal-conditioned policy, one can train a language-conditioned goal generator to generate language-agnostic goals for the agent. This method decouples sensorimotor learning from language acquisition and enables agents to demonstrate a diversity of behaviors for any given instruction. We propose a particular instantiation of this approach and demonstrate its benefits.
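To illustrate the core idea, the following is a minimal sketch of a language-conditioned goal generator. It is not the paper's actual instantiation: the instruction embeddings are toy one-hot codes, and a random linear map stands in for a trained generative network. The point is the interface — an instruction embedding plus noise yields a distribution over language-agnostic goal vectors, which a fixed goal-conditioned policy could then pursue without ever seeing language.

```python
import numpy as np

# Hypothetical instruction embeddings (toy 4-d one-hot codes).
INSTRUCTIONS = {
    "grasp red block": np.array([1.0, 0.0, 0.0, 0.0]),
    "grasp blue block": np.array([0.0, 1.0, 0.0, 0.0]),
}

class GoalGenerator:
    """Toy language-conditioned goal generator.

    Maps an instruction embedding plus Gaussian noise to a goal
    vector, so that a single instruction yields a *distribution*
    of compatible goals rather than one fixed target.
    """

    def __init__(self, embed_dim=4, noise_dim=2, goal_dim=3, seed=0):
        self.rng = np.random.default_rng(seed)
        # Random linear map standing in for a trained network.
        self.W = self.rng.normal(size=(goal_dim, embed_dim + noise_dim))
        self.noise_dim = noise_dim

    def sample(self, embedding, n=5):
        # Different noise draws -> diverse goals for one instruction.
        z = self.rng.normal(size=(n, self.noise_dim))
        x = np.hstack([np.tile(embedding, (n, 1)), z])
        return x @ self.W.T  # n candidate goal vectors

gen = GoalGenerator()
goals = gen.sample(INSTRUCTIONS["grasp red block"], n=5)
# A goal-conditioned policy pi(a | s, g) could pursue any row of
# `goals`; none of them references language at execution time.
print(goals.shape)  # (5, 3)
```

Because the generator, not the policy, carries the language conditioning, sensorimotor learning and language acquisition can be trained separately, and each instruction maps to many admissible behaviors.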
Cite

Colas et al. "Language-Conditioned Goal Generation: A New Approach to Language Grounding in RL." ICML 2020 Workshops: LaReL, 2020.
https://mlanthology.org/icmlw/2020/colas2020icmlw-languageconditioned/

BibTeX:
@inproceedings{colas2020icmlw-languageconditioned,
title = {{Language-Conditioned Goal Generation: A New Approach to Language Grounding in RL}},
author = {Colas, Cédric and Akakzia, Ahmed and Oudeyer, Pierre-Yves and Chetouani, Mohamed and Sigaud, Olivier},
booktitle = {ICML 2020 Workshops: LaReL},
year = {2020},
url = {https://mlanthology.org/icmlw/2020/colas2020icmlw-languageconditioned/}
}