PointGPT: Auto-Regressively Generative Pre-Training from Point Clouds
Abstract
Large language models (LLMs) based on the generative pre-training transformer (GPT) have demonstrated remarkable effectiveness across a diverse range of downstream tasks. Inspired by the advancements of GPT, we present PointGPT, a novel approach that extends the concept of GPT to point clouds, addressing the challenges associated with their disordered nature, low information density, and the gap between pre-training and downstream tasks. Specifically, a point cloud auto-regressive generation task is proposed to pre-train transformer models. Our method partitions the input point cloud into multiple point patches and arranges them in an ordered sequence based on their spatial proximity. Then, an extractor-generator based transformer decoder, with a dual masking strategy, learns latent representations conditioned on the preceding point patches, aiming to predict the next one in an auto-regressive manner. To explore scalability and enhance performance, a larger pre-training dataset is collected. Additionally, a subsequent post-pre-training stage is introduced, incorporating a labeled hybrid dataset. Our scalable approach allows for learning high-capacity models that generalize well, achieving state-of-the-art performance on various downstream tasks. In particular, our approach achieves classification accuracies of 94.9% on the ModelNet40 dataset and 93.4% on the ScanObjectNN dataset, outperforming all other transformer models. Furthermore, our method also attains new state-of-the-art accuracies on all four few-shot learning benchmarks. Code is available at https://github.com/CGuangyan-BIT/PointGPT.
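The abstract describes two preprocessing steps: partitioning the point cloud into point patches and arranging those patches into a sequence ordered by spatial proximity. Below is a minimal sketch of what such a pipeline could look like, not the authors' implementation; the use of farthest-point sampling for patch centers, kNN grouping, and Morton-code (Z-order) sorting are assumptions about plausible details that are not spelled out in the abstract (see the official repository for the actual code).

```python
# Hypothetical sketch: partition a point cloud into patches and order them
# by spatial proximity so they can be fed to an auto-regressive transformer.
import numpy as np

def farthest_point_sampling(points, n_centers):
    """Pick n_centers well-spread points to serve as patch centers."""
    n = points.shape[0]
    centers = np.zeros(n_centers, dtype=np.int64)
    dist = np.full(n, np.inf)
    idx = 0  # arbitrary starting point
    for i in range(n_centers):
        centers[i] = idx
        d = np.sum((points - points[idx]) ** 2, axis=1)
        dist = np.minimum(dist, d)
        idx = int(np.argmax(dist))
    return centers

def knn_patches(points, center_idx, k):
    """Group the k nearest neighbors of each center into a point patch."""
    centers = points[center_idx]                                   # (M, 3)
    d = np.linalg.norm(points[None] - centers[:, None], axis=-1)   # (M, N)
    nn = np.argsort(d, axis=1)[:, :k]                              # (M, k)
    return points[nn]                                              # (M, k, 3)

def morton_order(centers, bits=10):
    """Sort patch centers along a Z-order curve so that consecutive
    patches in the sequence are spatially close to each other."""
    mins, maxs = centers.min(0), centers.max(0)
    q = ((centers - mins) / np.maximum(maxs - mins, 1e-9)
         * (2 ** bits - 1)).astype(np.uint64)
    codes = np.zeros(len(centers), dtype=np.uint64)
    for b in range(bits):                       # interleave x, y, z bits
        for axis in range(3):
            codes |= ((q[:, axis] >> np.uint64(b)) & np.uint64(1)) \
                     << np.uint64(3 * b + axis)
    return np.argsort(codes)

# Usage: 1024 random points -> 64 ordered patches of 32 points each.
pts = np.random.rand(1024, 3).astype(np.float32)
center_idx = farthest_point_sampling(pts, n_centers=64)
patches = knn_patches(pts, center_idx, k=32)        # (64, 32, 3)
order = morton_order(pts[center_idx])
ordered_patches = patches[order]                     # sequence for pre-training
```

During pre-training, the model would then be trained to predict each patch conditioned on the preceding patches in this ordered sequence, analogous to next-token prediction in GPT.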
Cite
Text
Chen et al. "PointGPT: Auto-Regressively Generative Pre-Training from Point Clouds." Neural Information Processing Systems, 2023.
Markdown
[Chen et al. "PointGPT: Auto-Regressively Generative Pre-Training from Point Clouds." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/chen2023neurips-pointgpt/)
BibTeX
@inproceedings{chen2023neurips-pointgpt,
  title     = {{PointGPT: Auto-Regressively Generative Pre-Training from Point Clouds}},
  author    = {Chen, Guangyan and Wang, Meiling and Yang, Yi and Yu, Kai and Yuan, Li and Yue, Yufeng},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/chen2023neurips-pointgpt/}
}