REIN: Flexible Mesh Generation from Point Clouds
Abstract
3D reconstruction from sparse point clouds is a challenging problem. Existing methods interpolate from point clouds to produce meshes, but their performance degrades as the number of points decreases. To address this, we propose an algorithm that attends to the global structure while reconstructing the surface one vertex at a time. Experimental results on ShapeNet and ModelNet10 show an average improvement of 81.5% in Chamfer Distance and 14% in Point Normal Similarity over the Ball Pivoting Algorithm (BPA) and Poisson Surface Reconstruction (PSR). Qualitatively, the generated meshes more closely resemble the ground truth. Results on ShapeNet Patched illustrate a significant improvement in mesh quality compared to BPA and PSR. The code is available at https://github.com/rangeldaroya/rein.
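The abstract reports results in terms of Chamfer Distance, a standard metric for comparing point sets. As a minimal sketch of that metric (illustrative only; the paper's exact normalization and sampling procedure may differ), a symmetric Chamfer Distance between two point sets can be computed as:

```python
import numpy as np

def chamfer_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Symmetric Chamfer Distance between point sets a (N, 3) and b (M, 3).

    Illustrative of the evaluation metric named in the abstract;
    not the authors' implementation.
    """
    # Pairwise squared distances between every point in a and every point in b.
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    # Average nearest-neighbor distance in both directions.
    return float(d2.min(axis=1).mean() + d2.min(axis=0).mean())
```

For identical point sets the distance is zero; it grows as the reconstructed surface samples drift from the ground-truth samples.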
Cite
Text
Daroya et al. "REIN: Flexible Mesh Generation from Point Clouds." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2020. doi:10.1109/CVPRW50498.2020.00184
Markdown
[Daroya et al. "REIN: Flexible Mesh Generation from Point Clouds." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2020.](https://mlanthology.org/cvprw/2020/daroya2020cvprw-rein/) doi:10.1109/CVPRW50498.2020.00184
BibTeX
@inproceedings{daroya2020cvprw-rein,
title = {{REIN: Flexible Mesh Generation from Point Clouds}},
author = {Daroya, Rangel and Atienza, Rowel and Cajote, Rhandley},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2020},
pages = {1444--1453},
doi = {10.1109/CVPRW50498.2020.00184},
url = {https://mlanthology.org/cvprw/2020/daroya2020cvprw-rein/}
}