Efficient Plane-Based Optimization of Geometry and Texture for Indoor RGB-D Reconstruction
Abstract
We propose a novel approach to reconstructing indoor RGB-D scenes based on plane primitives. Our approach takes as input an RGB-D sequence and a dense coarse mesh reconstructed from it, and generates a lightweight, low-polygon mesh with clear face textures and sharp features, without losing geometric detail from the original scene. Compared to existing methods, which only cover large planar regions in the scene, our method builds the entire scene from adaptive planes while preserving both geometric detail and sharp features in the mesh. Experiments show that our method generates textured meshes from RGB-D data more efficiently than state-of-the-art methods.
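The core primitive the abstract refers to is a plane fitted to 3D points. As a rough illustration only (the paper's adaptive plane extraction and joint geometry/texture optimization are considerably more elaborate), a minimal RANSAC plane fit over a point set might look like the following sketch; the function name, thresholds, and iteration count are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def fit_plane_ransac(points, n_iters=200, inlier_thresh=0.01, rng=None):
    """Fit one plane (n, d) with n.p + d = 0 to 3D points via RANSAC.

    Illustrative sketch only; not the method proposed in the paper.
    """
    rng = np.random.default_rng(rng)
    best_inliers, best_plane = None, None
    for _ in range(n_iters):
        # Sample 3 distinct points and form a candidate plane.
        idx = rng.choice(len(points), size=3, replace=False)
        p0, p1, p2 = points[idx]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:  # degenerate (near-collinear) sample
            continue
        normal /= norm
        d = -normal.dot(p0)
        # Point-to-plane distances decide the inlier set.
        dist = np.abs(points @ normal + d)
        inliers = dist < inlier_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane, best_inliers
```

In a full pipeline such a fit would be applied repeatedly (removing inliers each round) to partition the scene into planar segments before texture optimization.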
Cite
Text
Wang and Guo. "Efficient Plane-Based Optimization of Geometry and Texture for Indoor RGB-D Reconstruction." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2019.
Markdown
[Wang and Guo. "Efficient Plane-Based Optimization of Geometry and Texture for Indoor RGB-D Reconstruction." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2019.](https://mlanthology.org/cvprw/2019/wang2019cvprw-efficient/)
BibTeX
@inproceedings{wang2019cvprw-efficient,
title = {{Efficient Plane-Based Optimization of Geometry and Texture for Indoor RGB-D Reconstruction}},
author = {Wang, Chao and Guo, Xiaohu},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2019},
pages = {49--53},
url = {https://mlanthology.org/cvprw/2019/wang2019cvprw-efficient/}
}