Neural Program Synthesis from Diverse Demonstration Videos

Abstract

Interpreting the decision-making logic in demonstration videos is key to collaborating with and mimicking humans. To empower machines with this ability, we propose a neural program synthesizer that explicitly synthesizes underlying programs from behaviorally diverse and visually complicated demonstration videos. We introduce a summarizer module as part of our model to improve the network's ability to integrate multiple demonstrations that vary in behavior. We also employ a multi-task objective to encourage the model to learn meaningful intermediate representations for end-to-end training. We show that our model reliably synthesizes underlying programs and captures the diverse behaviors exhibited in demonstrations. The code is available at https://shaohua0116.github.io/demo2program.

Cite

Text

Sun et al. "Neural Program Synthesis from Diverse Demonstration Videos." International Conference on Machine Learning, 2018.

Markdown

[Sun et al. "Neural Program Synthesis from Diverse Demonstration Videos." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/sun2018icml-neural/)

BibTeX

@inproceedings{sun2018icml-neural,
  title     = {{Neural Program Synthesis from Diverse Demonstration Videos}},
  author    = {Sun, Shao-Hua and Noh, Hyeonwoo and Somasundaram, Sriram and Lim, Joseph},
  booktitle = {International Conference on Machine Learning},
  year      = {2018},
  pages     = {4790--4799},
  volume    = {80},
  url       = {https://mlanthology.org/icml/2018/sun2018icml-neural/}
}