AnySkin: Plug-and-Play Skin Sensing for Robotic Touch

Abstract

While tactile sensing is widely accepted as an important and useful sensing modality, its use pales in comparison to other sensory modalities like vision and proprioception. AnySkin addresses the critical challenges that impede the use of tactile sensing -- versatility, replaceability, and data reusability. Building on the simple design of ReSkin and decoupling the sensing electronics from the sensing interface, AnySkin makes integration as straightforward as putting on a phone case and connecting a charger. Furthermore, AnySkin is the first uncalibrated tactile sensor to report cross-instance generalizability of learned manipulation policies. To summarize, this work makes three key contributions: first, we introduce a streamlined fabrication process and a design tool for creating an adhesive-free, durable, and easily replaceable magnetic tactile sensor; second, we characterize slip detection and policy learning with the AnySkin sensor; third, we demonstrate zero-shot generalization of models trained on one instance of AnySkin to new instances, and compare it with popular existing tactile solutions like DIGIT and ReSkin. Videos and more details can be found at https://anon-anyskin.github.io.

Cite

Text

Bhirangi et al. "AnySkin: Plug-and-Play Skin Sensing for Robotic Touch." NeurIPS 2024 Workshops: WTP, 2024.

Markdown

[Bhirangi et al. "AnySkin: Plug-and-Play Skin Sensing for Robotic Touch." NeurIPS 2024 Workshops: WTP, 2024.](https://mlanthology.org/neuripsw/2024/bhirangi2024neuripsw-anyskin/)

BibTeX

@inproceedings{bhirangi2024neuripsw-anyskin,
  title     = {{AnySkin: Plug-and-Play Skin Sensing for Robotic Touch}},
  author    = {Bhirangi, Raunaq and Pattabiraman, Venkatesh and Erciyes, Mehmet Enes and Cao, Yifeng and Hellebrekers, Tess and Pinto, Lerrel},
  booktitle = {NeurIPS 2024 Workshops: WTP},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/bhirangi2024neuripsw-anyskin/}
}