DexCatch: Learning to Catch Arbitrary Objects with Dexterous Hands

Abstract

Achieving human-like dexterous manipulation remains a crucial area of research in robotics. Current research focuses largely on improving the success rate of pick-and-place tasks. Compared with pick-and-place, throwing-catching behavior has the potential to transport objects to their destination more quickly. However, dynamic dexterous manipulation poses a major challenge for stable control due to the large number of dynamic contacts. In this paper, we propose a Learning-based framework for Throwing-Catching tasks using dexterous hands (LTC). Our method, LTC, achieves a 73% success rate across 45 scenarios (diverse hand poses and objects), and the learned policies demonstrate strong zero-shot transfer performance on unseen objects. Additionally, in tasks where the hand holding the object faces sideways, an extremely unstable scenario due to the lack of support from the palm, all baselines fail, while our method still achieves a success rate of over 60%.

Cite

Text

Lan et al. "DexCatch: Learning to Catch Arbitrary Objects with Dexterous Hands." Proceedings of The 8th Conference on Robot Learning, 2024.

Markdown

[Lan et al. "DexCatch: Learning to Catch Arbitrary Objects with Dexterous Hands." Proceedings of The 8th Conference on Robot Learning, 2024.](https://mlanthology.org/corl/2024/lan2024corl-dexcatch/)

BibTeX

@inproceedings{lan2024corl-dexcatch,
  title     = {{DexCatch: Learning to Catch Arbitrary Objects with Dexterous Hands}},
  author    = {Lan, Fengbo and Wang, Shengjie and Zhang, Yunzhe and Xu, Haotian and Oseni, Oluwatosin OluwaPelumi and Zhang, Ziye and Gao, Yang and Zhang, Tao},
  booktitle = {Proceedings of The 8th Conference on Robot Learning},
  year      = {2024},
  pages     = {2965--2981},
  volume    = {270},
  url       = {https://mlanthology.org/corl/2024/lan2024corl-dexcatch/}
}