Learning Portable Representations for High-Level Planning
Abstract
We present a framework for autonomously learning a portable representation that describes a collection of low-level continuous environments. We show that these abstract representations can be learned in a task-independent, egocentric space specific to the agent; when grounded with problem-specific information, they are provably sufficient for planning. We demonstrate transfer in two different domains: the agent learns a portable, task-independent symbolic vocabulary, along with operators expressed in that vocabulary, and then learns to instantiate those operators on a per-task basis. This reduces the number of samples required to learn a representation of each new task.
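The pipeline the abstract describes (learn egocentric symbols and operators once, then instantiate them for each new task) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation; all names here (PortableOperator, GroundedOperator, ground, and the example symbols) are hypothetical stand-ins for the paper's learned structures.

from dataclasses import dataclass

@dataclass(frozen=True)
class PortableOperator:
    """An operator over agent-space (egocentric) symbols,
    learned once and reused across tasks."""
    name: str
    preconditions: frozenset  # egocentric symbols that must hold
    effects: frozenset        # egocentric symbols made true

@dataclass
class GroundedOperator:
    """A portable operator instantiated for one task by
    attaching problem-specific context."""
    base: PortableOperator
    partition: int  # task-specific partition in which the operator applies

def ground(op, partitions):
    """Instantiate a portable operator once per task-specific partition.
    Only these instantiations must be learned for a new task, which is
    why transfer reduces the sample requirement."""
    return [GroundedOperator(op, p) for p in partitions]

# Example: one egocentric 'walk-through-door' skill, grounded to the
# three doors present in a particular task.
walk = PortableOperator(
    "walk-through-door",
    preconditions=frozenset({"facing-door", "door-open"}),
    effects=frozenset({"through-door"}),
)
grounded = ground(walk, partitions=[0, 1, 2])
print([g.partition for g in grounded])

The key design point the sketch illustrates is the split of the representation in two: the portable part (symbols and operators) is shared across all tasks, while only the lightweight grounding is relearned per task.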
Cite
Text
James et al. "Learning Portable Representations for High-Level Planning." International Conference on Machine Learning, 2020.
Markdown
[James et al. "Learning Portable Representations for High-Level Planning." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/james2020icml-learning/)
BibTeX
@inproceedings{james2020icml-learning,
title = {{Learning Portable Representations for High-Level Planning}},
author = {James, Steven and Rosman, Benjamin and Konidaris, George},
booktitle = {International Conference on Machine Learning},
year = {2020},
pages = {4682--4691},
volume = {119},
url = {https://mlanthology.org/icml/2020/james2020icml-learning/}
}