ARAPReg: An As-Rigid-as Possible Regularization Loss for Learning Deformable Shape Generators

Abstract

This paper introduces an unsupervised loss for training parametric deformable shape generators. The key idea is to enforce the preservation of local rigidity among the generated shapes. Our approach builds on a local approximation of the as-rigid-as-possible (ARAP) deformation energy. We show how to develop the unsupervised loss via a spectral decomposition of the Hessian of the ARAP energy. Our loss nicely decouples pose and shape variations through a robust norm. It admits simple closed-form expressions, is easy to train, and can be plugged into standard generative models such as VAEs and GANs. Experimental results show that our approach considerably outperforms existing shape generation approaches across datasets such as DFAUST, Animal, and Bone.
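
For context, below is a minimal NumPy sketch of the standard ARAP deformation energy that the abstract builds on: each vertex's local neighborhood is fitted with a best rotation (Kabsch/SVD), and the energy sums the residual non-rigid distortion of the incident edges. The uniform edge weights and helper names are illustrative assumptions, not the authors' released implementation, and the paper's actual loss (a local approximation plus a spectral decomposition of this energy's Hessian) is not reproduced here.

import numpy as np

def best_rotation(P, Q):
    # Best-fit rotation R minimizing sum_j ||R p_j - q_j||^2 (Kabsch algorithm).
    # Rows of P are rest-pose edge vectors, rows of Q are deformed edge vectors.
    U, _, Vt = np.linalg.svd(P.T @ Q)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # flip the last singular direction to avoid a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R

def arap_energy(rest, deformed, neighbors):
    # Sum over vertices i of sum_{j in N(i)} ||(x'_j - x'_i) - R_i (x_j - x_i)||^2,
    # with uniform (unit) edge weights for simplicity.
    energy = 0.0
    for i, nbrs in enumerate(neighbors):
        P = rest[nbrs] - rest[i]          # rest-pose edge vectors, shape (|N(i)|, 3)
        Q = deformed[nbrs] - deformed[i]  # deformed edge vectors, shape (|N(i)|, 3)
        R = best_rotation(P, Q)
        energy += np.sum((Q - P @ R.T) ** 2)
    return energy

A regularizer in this spirit penalizes generated shapes whose local neighborhoods cannot be explained by per-vertex rigid motions, which is what "preservation of local rigidity" refers to above.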

Cite

Text

Huang et al. "ARAPReg: An As-Rigid-as Possible Regularization Loss for Learning Deformable Shape Generators." International Conference on Computer Vision, 2021. doi:10.1109/ICCV48922.2021.00576

Markdown

[Huang et al. "ARAPReg: An As-Rigid-as Possible Regularization Loss for Learning Deformable Shape Generators." International Conference on Computer Vision, 2021.](https://mlanthology.org/iccv/2021/huang2021iccv-arapreg/) doi:10.1109/ICCV48922.2021.00576

BibTeX

@inproceedings{huang2021iccv-arapreg,
  title     = {{ARAPReg: An As-Rigid-as Possible Regularization Loss for Learning Deformable Shape Generators}},
  author    = {Huang, Qixing and Huang, Xiangru and Sun, Bo and Zhang, Zaiwei and Jiang, Junfeng and Bajaj, Chandrajit},
  booktitle = {International Conference on Computer Vision},
  year      = {2021},
  pages     = {5815-5825},
  doi       = {10.1109/ICCV48922.2021.00576},
  url       = {https://mlanthology.org/iccv/2021/huang2021iccv-arapreg/}
}