Bayesian Differentiable Physics for Cloth Digitalization

Abstract

We propose a new method for cloth digitalization. Deviating from existing methods, which learn from data captured under relatively casual settings, we propose to learn from data captured under strictly tested measuring protocols and to find plausible physical parameters of the cloths. However, such data is currently absent, so we first propose a new dataset with accurate cloth measurements. Further, the data size is considerably smaller than those typical in current deep learning, due to the nature of the data capture process. To learn from small data, we propose a new Bayesian differentiable cloth model to estimate the complex material heterogeneity of real cloths. It can provide highly accurate digitalization from very limited data samples. Through exhaustive evaluation and comparison, we show our method is accurate in cloth digitalization, efficient in learning from limited data samples, and general in capturing material variations. Code and data are available at: https://github.com/realcrane/Bayesian-Differentiable-Physics-for-Cloth-Digitalization
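To give a rough sense of the idea, the sketch below is a minimal, hypothetical toy (not the authors' model, data, or code): it infers the bending stiffness of a 2-D hanging strip with a differentiable simulator written in JAX, and places a Gaussian prior and a Laplace-approximate posterior over the log-stiffness. All constants (node count, rest length, stiffness values, observation noise, optimizer settings) are made up for illustration; the paper's actual model estimates heterogeneous material parameters of real cloths from measurements taken under standardized protocols.

```python
# Hypothetical toy sketch: Bayesian inference of one bending stiffness for a
# hanging 2-D strip, with JAX autodiff as the differentiable-physics component.
# This is NOT the paper's cloth model; every constant here is assumed.
import jax
import jax.numpy as jnp
import optax

N_NODES = 10        # nodes along the strip (assumed toy resolution)
REST_LEN = 0.05     # rest length between neighbouring nodes [m]
GRAVITY = 9.81      # gravitational acceleration [m/s^2]

def simulate(log_k, n_steps=200, dt=1e-3):
    """Damped explicit simulation of a pinned strip; bending stiffness k = exp(log_k)."""
    k = jnp.exp(log_k)
    x = jnp.stack([jnp.arange(N_NODES) * REST_LEN, jnp.zeros(N_NODES)], axis=1)
    v = jnp.zeros_like(x)

    def energy(x):
        seg = x[1:] - x[:-1]
        stretch = 1e3 * jnp.sum((jnp.linalg.norm(seg, axis=1) - REST_LEN) ** 2)
        bend = k * jnp.sum((seg[1:] - seg[:-1]) ** 2)
        return stretch + bend

    grad_e = jax.grad(energy)  # elastic forces via autodiff of the energy

    def step(carry, _):
        x, v = carry
        f = -grad_e(x) + jnp.array([0.0, -GRAVITY])  # elastic force + gravity
        v = 0.98 * (v + dt * f)                      # damped explicit Euler
        x = x + dt * v
        x = x.at[0].set(jnp.zeros(2))                # pin the first node
        return (x, v), None

    (x, _), _ = jax.lax.scan(step, (x, v), None, length=n_steps)
    return x

def neg_log_posterior(log_k, observed, noise=1e-2):
    prior = 0.5 * log_k ** 2                                       # Gaussian prior on log k
    lik = 0.5 * jnp.sum((simulate(log_k) - observed) ** 2) / noise ** 2
    return prior + lik

# Synthetic "measurement" from a ground-truth stiffness, then MAP + Laplace.
observed = simulate(jnp.log(0.5))
params, opt = jnp.log(2.0), optax.adam(5e-2)
state = opt.init(params)
grad_fn = jax.jit(jax.grad(neg_log_posterior))
for _ in range(300):                                               # gradient-based MAP fit
    updates, state = opt.update(grad_fn(params, observed), state)
    params = optax.apply_updates(params, updates)

hess = jax.grad(jax.grad(neg_log_posterior))(params, observed)     # Laplace approximation
print("MAP stiffness:", float(jnp.exp(params)),
      "posterior std of log k:", float(1.0 / jnp.sqrt(hess)))
```

The Bayesian treatment is what makes this viable with very few measurements: even a single observed configuration yields not just a point estimate of the material parameter but an uncertainty over it, which is the property the paper relies on when learning from small, carefully measured datasets.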

Cite

Text

Gong et al. "Bayesian Differentiable Physics for Cloth Digitalization." Conference on Computer Vision and Pattern Recognition, 2024. doi:10.1109/CVPR52733.2024.01125

Markdown

[Gong et al. "Bayesian Differentiable Physics for Cloth Digitalization." Conference on Computer Vision and Pattern Recognition, 2024.](https://mlanthology.org/cvpr/2024/gong2024cvpr-bayesian/) doi:10.1109/CVPR52733.2024.01125

BibTeX

@inproceedings{gong2024cvpr-bayesian,
  title     = {{Bayesian Differentiable Physics for Cloth Digitalization}},
  author    = {Gong, Deshan and Mao, Ningtao and Wang, He},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2024},
  pages     = {11841--11851},
  doi       = {10.1109/CVPR52733.2024.01125},
  url       = {https://mlanthology.org/cvpr/2024/gong2024cvpr-bayesian/}
}