Unsupervised Template-Assisted Point Cloud Shape Correspondence Network

Abstract

Unsupervised point cloud shape correspondence aims to establish point-wise correspondences between source and target point clouds. Existing methods obtain correspondences directly by computing point-wise feature similarity between point clouds. However, non-rigid objects possess strong deformability and unusual shapes, making it a longstanding challenge to directly establish correspondences between point clouds with unconventional shapes. To address this challenge, we propose an unsupervised Template-Assisted point cloud shape correspondence Network, termed TANet, comprising a template generation module and a template assistance module. The proposed TANet enjoys several merits. Firstly, the template generation module establishes a set of learnable templates with explicit structures. Secondly, we introduce a template assistance module that extensively leverages the generated templates to establish more accurate shape correspondences from multiple perspectives. Extensive experiments on four human and animal datasets demonstrate that TANet achieves favorable performance against state-of-the-art methods.
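To make the abstract's two ingredients concrete, below is a minimal PyTorch-style sketch of a bank of learnable template point clouds whose per-point features are matched against an input shape by normalized feature similarity. All class names, tensor shapes, and the matching scheme are assumptions for illustration, not the paper's released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LearnableTemplates(nn.Module):
    """Hypothetical sketch: a bank of learnable template point clouds.

    Shapes and names are assumptions for illustration; this is not the
    authors' code.
    """

    def __init__(self, num_templates: int = 8, num_points: int = 1024,
                 feat_dim: int = 128):
        super().__init__()
        # Each template is a learnable set of 3D points with an explicit
        # structure, optimized jointly with the correspondence network.
        self.template_points = nn.Parameter(
            0.1 * torch.randn(num_templates, num_points, 3))
        # Per-point template features used when matching input shapes.
        self.template_feats = nn.Parameter(
            torch.randn(num_templates, num_points, feat_dim))

    def correspond(self, feats: torch.Tensor) -> torch.Tensor:
        """Soft point-wise correspondence between input point features
        of shape (N, feat_dim) and every template in the bank."""
        f = F.normalize(feats, dim=-1)                 # (N, D)
        t = F.normalize(self.template_feats, dim=-1)   # (K, M, D)
        sim = torch.einsum("nd,kmd->knm", f, t)        # (K, N, M)
        # Softmax over template points gives a soft assignment per template.
        return sim.softmax(dim=-1)


if __name__ == "__main__":
    bank = LearnableTemplates()
    dummy_feats = torch.randn(1024, 128)  # features of one input point cloud
    assign = bank.correspond(dummy_feats)
    print(assign.shape)  # torch.Size([8, 1024, 1024])
```

In this reading, the templates act as a shared intermediary: matching both source and target against the same template bank sidesteps direct matching between two unconventionally shaped point clouds, which is the difficulty the abstract identifies.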

Cite

Text

Deng et al. "Unsupervised Template-Assisted Point Cloud Shape Correspondence Network." Conference on Computer Vision and Pattern Recognition, 2024. doi:10.1109/CVPR52733.2024.00502

Markdown

[Deng et al. "Unsupervised Template-Assisted Point Cloud Shape Correspondence Network." Conference on Computer Vision and Pattern Recognition, 2024.](https://mlanthology.org/cvpr/2024/deng2024cvpr-unsupervised/) doi:10.1109/CVPR52733.2024.00502

BibTeX

@inproceedings{deng2024cvpr-unsupervised,
  title     = {{Unsupervised Template-Assisted Point Cloud Shape Correspondence Network}},
  author    = {Deng, Jiacheng and Lu, Jiahao and Zhang, Tianzhu},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2024},
  pages     = {5250--5259},
  doi       = {10.1109/CVPR52733.2024.00502},
  url       = {https://mlanthology.org/cvpr/2024/deng2024cvpr-unsupervised/}
}