Lucid-XR: An Extended-Reality Data Engine for Robotic Manipulation

Abstract

We introduce Lucid-XR, a generative data engine for creating diverse, realistic-looking data to train real-world robot systems. At the core of Lucid-XR is vuer, a web-based physics simulation environment that runs directly on the XR headset, enabling internet-scale access to immersive, latency-free virtual interactions without specialized equipment. The complete system integrates on-device physics simulation with on-device human-to-robot pose retargeting, which are further amplified by a physics-guided video generation pipeline that can be commanded with natural-language specifications. We demonstrate zero-shot sim-to-real transfer of robot visual policies, trained entirely on Lucid-XR’s synthetic data, across bimanual and dexterous manipulation tasks involving flexible materials, adhesive interactions between particles, and rigid-body contact.

Cite

Text

Ravan et al. "Lucid-XR: An Extended-Reality Data Engine for Robotic Manipulation." Proceedings of The 9th Conference on Robot Learning, 2025.

Markdown

[Ravan et al. "Lucid-XR: An Extended-Reality Data Engine for Robotic Manipulation." Proceedings of The 9th Conference on Robot Learning, 2025.](https://mlanthology.org/corl/2025/ravan2025corl-lucidxr/)

BibTeX

@inproceedings{ravan2025corl-lucidxr,
  title     = {{Lucid-XR: An Extended-Reality Data Engine for Robotic Manipulation}},
  author    = {Ravan, Yajvan and Rashid, Adam and Yu, Alan and McClennen, Kai and Huh, Gio and Yang, Kevin and Yang, Zhutian and Yu, Qinxi and Wang, Xiaolong and Isola, Phillip and Yang, Ge},
  booktitle = {Proceedings of The 9th Conference on Robot Learning},
  year      = {2025},
  pages     = {5151--5169},
  volume    = {305},
  url       = {https://mlanthology.org/corl/2025/ravan2025corl-lucidxr/}
}