Model-Based Planning with Energy-Based Models
Abstract
Model-based planning holds great promise for improving both sample efficiency and generalization in reinforcement learning (RL). We show that energy-based models (EBMs) are a promising class of models for model-based planning: EBMs naturally support inference of intermediate states given start and goal state distributions. We provide an online algorithm to train EBMs while interacting with the environment, and show that EBMs allow for significantly better online learning than corresponding feed-forward networks. We further show that EBMs support maximum entropy state inference and are able to generate diverse state space plans. Inference purely in state space, without planning actions, allows for better generalization to previously unseen obstacles in the environment and prevents the planner from exploiting the dynamics model with uncharacteristic action sequences. Finally, we show that online EBM training naturally leads to intentionally planned state exploration, which performs significantly better than random exploration.
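To make the planning procedure the abstract describes concrete, the sketch below infers intermediate states between fixed start and goal states by gradient-based (Langevin-style) descent on a learned transition energy. This is a minimal, hypothetical illustration, not the authors' implementation: the TransitionEnergy network, horizon, step size, and noise scale are all illustrative assumptions.

# Minimal sketch (not the authors' code) of state-space planning with an
# energy-based model: intermediate states between fixed start and goal are
# inferred by Langevin-style gradient steps on the summed transition energy.
# The network architecture and all hyperparameters here are assumptions.
import torch
import torch.nn as nn

class TransitionEnergy(nn.Module):
    """Scores a transition (s_t, s_{t+1}); lower energy = more plausible."""
    def __init__(self, state_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, s, s_next):
        return self.net(torch.cat([s, s_next], dim=-1)).squeeze(-1)

def plan_states(energy, start, goal, horizon=10, steps=200, lr=0.05, noise=0.01):
    """Infer intermediate states by descending total trajectory energy.
    Gaussian noise keeps sampling stochastic, so repeated calls yield
    diverse plans (in the spirit of maximum entropy inference)."""
    # Initialize intermediate states by linear interpolation start -> goal.
    alphas = torch.linspace(0, 1, horizon + 2)[1:-1].unsqueeze(-1)
    inter = ((1 - alphas) * start + alphas * goal).clone().requires_grad_(True)
    for _ in range(steps):
        traj = torch.cat([start.unsqueeze(0), inter, goal.unsqueeze(0)], dim=0)
        total_energy = energy(traj[:-1], traj[1:]).sum()
        grad, = torch.autograd.grad(total_energy, inter)
        with torch.no_grad():
            inter -= lr * grad + noise * torch.randn_like(inter)
    return inter.detach()

if __name__ == "__main__":
    dim = 4
    ebm = TransitionEnergy(dim)
    start, goal = torch.zeros(dim), torch.ones(dim)
    plan = plan_states(ebm, start, goal)
    print(plan.shape)  # (horizon, state_dim): inferred intermediate states

Because the plan lives entirely in state space, nothing here queries an action model; the injected noise in the update is what produces diverse trajectories across runs.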
Cite
Text
Du et al. "Model-Based Planning with Energy-Based Models." Conference on Robot Learning, 2019.

Markdown

[Du et al. "Model-Based Planning with Energy-Based Models." Conference on Robot Learning, 2019.](https://mlanthology.org/corl/2019/du2019corl-modelbased/)

BibTeX
@inproceedings{du2019corl-modelbased,
  title     = {{Model-Based Planning with Energy-Based Models}},
  author    = {Du, Yilun and Lin, Toru and Mordatch, Igor},
  booktitle = {Conference on Robot Learning},
  year      = {2019},
  pages     = {374--383},
  volume    = {100},
  url       = {https://mlanthology.org/corl/2019/du2019corl-modelbased/}
}