Hierarchical Visuomotor Control of Humanoids
Abstract
We aim to build complex humanoid agents that integrate perception, motor control, and memory. In this work, we partly factor this problem into low-level motor control from proprioception and high-level coordination of the low-level skills informed by vision. We develop an architecture capable of surprisingly flexible, task-directed motor control of a relatively high-DoF humanoid body by combining pre-training of low-level motor controllers with a high-level, task-focused controller that switches among low-level sub-policies. The resulting system is able to control a physically-simulated humanoid body to solve tasks that require coupling visual perception from an unstabilized egocentric RGB camera during locomotion in the environment. Supplementary video link: https://youtu.be/fBoir7PNxPk
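To make the architecture described in the abstract concrete, below is a minimal, hypothetical sketch of the high-level/low-level split: pre-trained low-level sub-policies act from proprioception, while a high-level controller switches among them using egocentric visual features. All names, dimensions (PROPRIO_DIM, VISION_DIM, ACTION_DIM, NUM_SKILLS), and the random linear placeholder policies are illustrative assumptions, not the paper's implementation; in the paper these components are neural networks trained with motion capture and deep RL.

import numpy as np

rng = np.random.default_rng(0)

PROPRIO_DIM = 56   # assumed proprioceptive feature size (joint angles, velocities, ...)
VISION_DIM = 64    # assumed size of encoded egocentric-camera features
ACTION_DIM = 21    # assumed number of actuated joints
NUM_SKILLS = 4     # assumed number of pre-trained low-level sub-policies


class LowLevelSkill:
    """A pre-trained motor sub-policy mapping proprioception to joint torques.

    Hypothetical stand-in: here each skill is a fixed random linear map,
    purely for illustration of the interface.
    """

    def __init__(self):
        self.w = rng.normal(scale=0.1, size=(ACTION_DIM, PROPRIO_DIM))

    def act(self, proprio):
        return np.tanh(self.w @ proprio)


class HighLevelController:
    """Task-level controller that switches among low-level skills using vision.

    Re-selects the active skill every `switch_period` control steps, mirroring
    the high-level switching over low-level sub-policies described above.
    """

    def __init__(self, skills, switch_period=10):
        self.skills = skills
        self.switch_period = switch_period
        self.w = rng.normal(scale=0.1, size=(len(skills), VISION_DIM))  # placeholder scorer
        self.active = 0
        self.t = 0

    def act(self, vision, proprio):
        if self.t % self.switch_period == 0:
            # Pick the skill with the highest (placeholder) score for this view.
            self.active = int(np.argmax(self.w @ vision))
        self.t += 1
        return self.skills[self.active].act(proprio)


if __name__ == "__main__":
    skills = [LowLevelSkill() for _ in range(NUM_SKILLS)]
    policy = HighLevelController(skills)
    for _ in range(30):  # dummy rollout with random observations
        action = policy.act(rng.normal(size=VISION_DIM), rng.normal(size=PROPRIO_DIM))
    print("last action:", np.round(action, 2))

In this sketch the only learning-relevant design choice retained from the abstract is the division of labor: the low-level skills never see vision, and the high-level controller never outputs torques directly.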
Cite
Text
Merel et al. "Hierarchical Visuomotor Control of Humanoids." International Conference on Learning Representations, 2019.
Markdown
[Merel et al. "Hierarchical Visuomotor Control of Humanoids." International Conference on Learning Representations, 2019.](https://mlanthology.org/iclr/2019/merel2019iclr-hierarchical/)
BibTeX
@inproceedings{merel2019iclr-hierarchical,
title = {{Hierarchical Visuomotor Control of Humanoids}},
author = {Merel, Josh and Ahuja, Arun and Pham, Vu and Tunyasuvunakool, Saran and Liu, Siqi and Tirumala, Dhruva and Heess, Nicolas and Wayne, Greg},
booktitle = {International Conference on Learning Representations},
year = {2019},
url = {https://mlanthology.org/iclr/2019/merel2019iclr-hierarchical/}
}