Visual Memory for Robust Path Following

Abstract

Humans routinely retrace a path in a novel environment both forwards and backwards despite uncertainty in their motion. In this paper, we present an approach for doing so. Given a demonstration of a path, a first network generates an abstraction of the path. Equipped with this abstraction, a second network then observes the world and decides how to act to retrace the path under noisy actuation and a changing environment. The two networks are optimized end-to-end at training time. We evaluate the method in two realistic simulators, performing path following both forwards and backwards. Our experiments show that our approach outperforms both a classical approach to this task and a number of other baselines.
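The abstract describes a two-network design: one network compresses a demonstration into a path abstraction, and a second network attends to that abstraction while observing the world to choose actions, with both trained end-to-end. Since only the abstract is reproduced here, the sketch below is a hedged PyTorch illustration of that general idea, not the paper's actual architecture: the encoder sizes, the single-head attention over the memory, the four-way discrete action space, and the imitation-learning loss are all illustrative assumptions.

```python
import torch
import torch.nn as nn


def make_encoder(emb_dim):
    """Small conv encoder mapping an RGB frame to a fixed-size embedding."""
    return nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
        nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(32, emb_dim),
    )


class PathAbstraction(nn.Module):
    """First network: compresses the demonstration into per-frame embeddings."""
    def __init__(self, emb_dim=64):
        super().__init__()
        self.encoder = make_encoder(emb_dim)

    def forward(self, demo_frames):           # demo_frames: (T, 3, H, W)
        return self.encoder(demo_frames)      # (T, emb_dim)


class RetracePolicy(nn.Module):
    """Second network: attends over the abstraction given the current view
    and outputs action logits (hypothetical forward/left/right/stop actions)."""
    def __init__(self, emb_dim=64, n_actions=4):
        super().__init__()
        self.obs_encoder = make_encoder(emb_dim)
        self.attn = nn.MultiheadAttention(emb_dim, num_heads=1, batch_first=True)
        self.head = nn.Linear(emb_dim, n_actions)

    def forward(self, obs, memory):           # obs: (3, H, W), memory: (T, emb_dim)
        q = self.obs_encoder(obs.unsqueeze(0)).unsqueeze(1)  # (1, 1, emb_dim)
        kv = memory.unsqueeze(0)                             # (1, T, emb_dim)
        ctx, _ = self.attn(q, kv, kv)                        # (1, 1, emb_dim)
        return self.head(ctx.squeeze(1))                     # (1, n_actions)


# One end-to-end imitation step on dummy data: the loss gradient flows
# through the policy and back into the abstraction network, so both are
# optimized jointly, as the abstract states.
abstraction, policy = PathAbstraction(), RetracePolicy()
opt = torch.optim.Adam(
    list(abstraction.parameters()) + list(policy.parameters()), lr=1e-3
)

demo = torch.randn(10, 3, 64, 64)    # 10-frame demonstration of the path
obs = torch.randn(3, 64, 64)         # current observation while retracing
expert_action = torch.tensor([1])    # hypothetical expert label for this step

logits = policy(obs, abstraction(demo))
loss = nn.functional.cross_entropy(logits, expert_action)
opt.zero_grad()
loss.backward()
opt.step()
```

To follow a path backwards under this sketch, one would simply reverse the order of the demonstration frames fed to the abstraction network; how the paper itself handles backward traversal is not specified in the abstract.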

Cite

Text

Kumar et al. "Visual Memory for Robust Path Following." Neural Information Processing Systems, 2018.

Markdown

[Kumar et al. "Visual Memory for Robust Path Following." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/kumar2018neurips-visual/)

BibTeX

@inproceedings{kumar2018neurips-visual,
  title     = {{Visual Memory for Robust Path Following}},
  author    = {Kumar, Ashish and Gupta, Saurabh and Fouhey, David and Levine, Sergey and Malik, Jitendra},
  booktitle = {Neural Information Processing Systems},
  year      = {2018},
  pages     = {765--774},
  url       = {https://mlanthology.org/neurips/2018/kumar2018neurips-visual/}
}