Modularity Through Attention: Efficient Training and Transfer of Language-Conditioned Policies for Robot Manipulation
Abstract
Language-conditioned policies allow robots to interpret and execute human instructions. Learning such policies requires a substantial investment of time and compute resources. Yet the resulting controllers are highly device-specific and cannot easily be transferred to a robot with a different morphology, capability, appearance, or dynamics. In this paper, we propose a sample-efficient approach for training language-conditioned manipulation policies that allows for rapid transfer across different types of robots. By introducing a novel method, namely Hierarchical Modularity, and adopting supervised attention across multiple sub-modules, we bridge the divide between modular and end-to-end learning and enable the reuse of functional building blocks. In both simulated and real-world robot manipulation experiments, we demonstrate that our method outperforms current state-of-the-art methods and can transfer policies across four different robots in a sample-efficient manner. Finally, we show that the functionality of learned sub-modules is maintained beyond the training process and can be used to introspect the robot's decision-making process.
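For intuition, the sketch below illustrates the supervised-attention idea named in the abstract: a sub-module exposes its attention weights so that an auxiliary loss can match them to annotated targets (e.g., the object an instruction refers to), which is also what keeps the module inspectable after training. This is a minimal, hypothetical PyTorch illustration; all names, shapes, and the specific loss are our own assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionSubModule(nn.Module):
    """One functional building block: attends over per-object visual
    features conditioned on a language embedding. The attention weights
    are returned so they can be supervised during training and
    inspected afterwards (hypothetical design, for illustration)."""

    def __init__(self, feat_dim: int, lang_dim: int):
        super().__init__()
        self.query = nn.Linear(lang_dim, feat_dim)  # language -> attention query

    def forward(self, obj_feats: torch.Tensor, lang_emb: torch.Tensor):
        # obj_feats: (B, N, feat_dim) candidate object features
        # lang_emb:  (B, lang_dim) encoded instruction
        q = self.query(lang_emb).unsqueeze(1)               # (B, 1, feat_dim)
        scores = (q * obj_feats).sum(dim=-1)                # (B, N) dot-product scores
        attn = F.softmax(scores, dim=-1)                    # (B, N) attention weights
        attended = (attn.unsqueeze(-1) * obj_feats).sum(1)  # (B, feat_dim) pooled feature
        return attended, attn


def supervised_attention_loss(attn: torch.Tensor, target_idx: torch.Tensor) -> torch.Tensor:
    """Auxiliary loss pushing the attention distribution toward an
    annotated target index (one assumed form of attention supervision)."""
    return F.nll_loss(torch.log(attn + 1e-8), target_idx)


# Tiny usage example with random data.
module = AttentionSubModule(feat_dim=32, lang_dim=16)
obj_feats = torch.randn(4, 5, 32)    # batch of 4 scenes, 5 objects each
lang_emb = torch.randn(4, 16)        # encoded instructions
target = torch.tensor([2, 0, 4, 1])  # annotated referent object per scene

attended, attn = module(obj_feats, lang_emb)
loss = supervised_attention_loss(attn, target)  # added to the action-prediction loss
loss.backward()
```

Exposing the attention weights in each sub-module's interface is what would let such blocks be trained with direct supervision, reused across robots, and queried after training to introspect the policy's decisions.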
Cite
Text
Zhou et al. "Modularity Through Attention: Efficient Training and Transfer of Language-Conditioned Policies for Robot Manipulation." Conference on Robot Learning, 2022.
Markdown
[Zhou et al. "Modularity Through Attention: Efficient Training and Transfer of Language-Conditioned Policies for Robot Manipulation." Conference on Robot Learning, 2022.](https://mlanthology.org/corl/2022/zhou2022corl-modularity/)
BibTeX
@inproceedings{zhou2022corl-modularity,
title = {{Modularity Through Attention: Efficient Training and Transfer of Language-Conditioned Policies for Robot Manipulation}},
author = {Zhou, Yifan and Sonawani, Shubham and Phielipp, Mariano and Stepputtis, Simon and Ben Amor, Heni},
booktitle = {Conference on Robot Learning},
year = {2022},
pages = {1684--1695},
volume = {205},
url = {https://mlanthology.org/corl/2022/zhou2022corl-modularity/}
}