Robust 3D Shape Classification via Non-Local Graph Attention Network

Abstract

We introduce a non-local graph attention network (NLGAT) that generates a novel global descriptor through two sub-networks for robust 3D shape classification. The first sub-network captures global relationships between points (i.e., point-point features) via a global relationship network (GRN). The second sub-network enhances local features with a geometric shape attention map obtained from a global structure network (GSN). To maintain rotation invariance and extract more information from sparse point clouds, all sub-networks take Gram matrices of different dimensions as input. Additionally, the GRN effectively preserves low-frequency features, which improves classification accuracy. Experiments on several datasets show that NLGAT outperforms other state-of-the-art models. In particular, on sparse point clouds (64 points) with noise under arbitrary SO(3) rotations, NLGAT reaches 85.4% accuracy, an improvement of 39.4% over the best competing method.
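A brief sketch of why a Gram-matrix input yields rotation invariance (an illustration under our own assumptions, not the authors' code): for a point matrix P and any rotation R, (PR)(PR)ᵀ = P R Rᵀ Pᵀ = PPᵀ, so the Gram matrix is unchanged by any SO(3) rotation of the cloud.

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.standard_normal((64, 3))     # a sparse 64-point cloud (hypothetical data)

# Build a random SO(3) rotation via QR decomposition.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1                    # flip a column so det(Q) = +1

G = P @ P.T                          # Gram matrix of the original cloud
G_rot = (P @ Q) @ (P @ Q).T          # Gram matrix of the rotated cloud

print(np.allclose(G, G_rot))         # True: rotation leaves the Gram matrix unchanged
```

This invariance is what lets the network classify shapes consistently under the arbitrary SO(3) rotations evaluated in the paper.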

Cite

Text

Qin et al. "Robust 3D Shape Classification via Non-Local Graph Attention Network." Conference on Computer Vision and Pattern Recognition, 2023. doi:10.1109/CVPR52729.2023.00520

Markdown

[Qin et al. "Robust 3D Shape Classification via Non-Local Graph Attention Network." Conference on Computer Vision and Pattern Recognition, 2023.](https://mlanthology.org/cvpr/2023/qin2023cvpr-robust/) doi:10.1109/CVPR52729.2023.00520

BibTeX

@inproceedings{qin2023cvpr-robust,
  title     = {{Robust 3D Shape Classification via Non-Local Graph Attention Network}},
  author    = {Qin, Shengwei and Li, Zhong and Liu, Ligang},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2023},
  pages     = {5374--5383},
  doi       = {10.1109/CVPR52729.2023.00520},
  url       = {https://mlanthology.org/cvpr/2023/qin2023cvpr-robust/}
}