Neural Shape Compiler: A Unified Framework for Transforming Between Text, Point Cloud, and Program
Abstract
3D shapes have complementary abstractions, from low-level geometry to part-based hierarchies to language, which convey different levels of information. This paper presents a unified framework to translate between pairs of shape abstractions: $\textit{Text}$ $\Longleftrightarrow$ $\textit{Point Cloud}$ $\Longleftrightarrow$ $\textit{Program}$. We propose the $\textbf{\textit{Neural Shape Compiler}}$ to model the abstraction transformation as a conditional generation process. It converts 3D shapes of three abstract types into a unified discrete shape code, transforms each shape code into codes of the other abstract types through the proposed $\textit{ShapeCode Transformer}$, and decodes them to output the target shape abstraction. Point cloud codes are obtained in a class-agnostic way by the proposed $\textit{Point}$VQVAE. On the Text2Shape, ShapeGlot, ABO, Genre, and Program Synthetic datasets, Neural Shape Compiler shows strengths in $\textit{Text}$ $\Longrightarrow$ $\textit{Point Cloud}$, $\textit{Point Cloud}$ $\Longrightarrow$ $\textit{Text}$, $\textit{Point Cloud}$ $\Longrightarrow$ $\textit{Program}$, and Point Cloud Completion tasks. Additionally, Neural Shape Compiler benefits from joint training on all heterogeneous data and tasks.
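As a rough illustration of the pipeline the abstract describes (encode each modality into discrete shape codes, translate codes with a transformer, decode into the target abstraction), the following PyTorch sketch shows the two central pieces: a VQ-VAE-style quantizer standing in for PointVQVAE and a transformer standing in for the ShapeCode Transformer. The module names, quantizer details, and all hyperparameters below are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class VectorQuantizer(nn.Module):
    """Nearest-neighbour codebook lookup (VQ-VAE style); a stand-in for how
    PointVQVAE might discretize shape features into discrete shape codes."""
    def __init__(self, num_codes=512, dim=256):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, dim)

    def forward(self, z):                                   # z: (B, N, dim) continuous features
        book = self.codebook.weight.unsqueeze(0).expand(z.size(0), -1, -1)
        idx = torch.cdist(z, book).argmin(dim=-1)           # (B, N) discrete shape code indices
        return idx, self.codebook(idx)                      # indices and quantized features

class ShapeCodeTranslator(nn.Module):
    """Stand-in for the ShapeCode Transformer: maps source-abstraction code
    tokens to logits over target-abstraction code tokens."""
    def __init__(self, vocab=512, dim=256, heads=8, layers=4):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)
        self.head = nn.Linear(dim, vocab)

    def forward(self, src_codes):                           # src_codes: (B, N) integer tokens
        return self.head(self.encoder(self.embed(src_codes)))  # (B, N, vocab) logits

if __name__ == "__main__":
    B, N, D = 2, 128, 256
    quantizer = VectorQuantizer(num_codes=512, dim=D)
    translator = ShapeCodeTranslator(vocab=512, dim=D)
    point_feats = torch.randn(B, N, D)                      # pretend output of a point encoder
    codes, _ = quantizer(point_feats)                       # point cloud -> discrete shape code
    logits = translator(codes)                              # shape code -> target code logits
    print(codes.shape, logits.shape)                        # (2, 128) and (2, 128, 512)
```

In the actual framework these target-code predictions would then be decoded by a modality-specific decoder (point cloud, text, or program); that stage is omitted here for brevity.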
Cite
Text
Luo et al. "Neural Shape Compiler: A Unified Framework for Transforming Between Text, Point Cloud, and Program." Transactions on Machine Learning Research, 2023.

Markdown
[Luo et al. "Neural Shape Compiler: A Unified Framework for Transforming Between Text, Point Cloud, and Program." Transactions on Machine Learning Research, 2023.](https://mlanthology.org/tmlr/2023/luo2023tmlr-neural/)

BibTeX
@article{luo2023tmlr-neural,
title = {{Neural Shape Compiler: A Unified Framework for Transforming Between Text, Point Cloud, and Program}},
author = {Luo, Tiange and Lee, Honglak and Johnson, Justin},
journal = {Transactions on Machine Learning Research},
year = {2023},
url = {https://mlanthology.org/tmlr/2023/luo2023tmlr-neural/}
}