IPDC: Iterative Part-Based Dense Correspondence Between Point Clouds
Abstract
Color point clouds from 3D scanners are a representation of real-world geometry and color. However, such scan data are imperfect, containing noise, outliers, and occlusions. Noise-free point clouds can be computed by virtually scanning 3D CAD models from a pre-built library, but their geometry may differ from real-world objects. We describe a new algorithm that automatically computes dense correspondences between point cloud scans of same-type objects, making it possible to transfer real-world color from a noisy scan (source) to a noise-free virtual scan (target), even when the scanned objects differ. The method segments both point clouds into parts and then computes part correspondences between them. An iterative algorithm applies a set of rigid transformations to the corresponding parts to determine a dense mapping between them. The dense mapping allows color or other parameter transfers. The resulting point cloud has the geometry of the virtual scan and the color from the real-world scan, mapped in a semantically consistent manner.
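The core idea of aligning a corresponding part pair rigidly and then transferring color by nearest-neighbor lookup can be sketched as follows. This is a simplified stand-in for the paper's iterative part-based procedure, not its actual implementation: the function names `kabsch_align` and `transfer_colors` are hypothetical, the rigid fit assumes known point correspondences (a Kabsch/Procrustes solve), and brute-force nearest neighbors replace any acceleration structure.

```python
import numpy as np

def kabsch_align(P, Q):
    """Best-fit rotation R and translation t such that R @ p + t ~= q
    for corresponding rows of P (source part) and Q (target part)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)            # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # reflection-safe rotation
    t = cQ - R @ cP
    return R, t

def transfer_colors(src_pts, src_colors, tgt_pts, R, t):
    """Move the source part into the target's frame, then give each target
    point the color of its nearest aligned source point (brute force)."""
    aligned = src_pts @ R.T + t
    d2 = ((tgt_pts[:, None, :] - aligned[None, :, :]) ** 2).sum(axis=-1)
    nearest = d2.argmin(axis=1)          # dense source index per target point
    return src_colors[nearest]
```

In the paper's setting this per-part alignment would be repeated over all corresponding part pairs, with the iteration refining the transformations; here a single closed-form fit illustrates the mapping-then-transfer step.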
Cite
Text
Qiu and Neumann. "IPDC: Iterative Part-Based Dense Correspondence Between Point Clouds." IEEE/CVF Winter Conference on Applications of Computer Vision, 2016. doi:10.1109/WACV.2016.7477626
Markdown
[Qiu and Neumann. "IPDC: Iterative Part-Based Dense Correspondence Between Point Clouds." IEEE/CVF Winter Conference on Applications of Computer Vision, 2016.](https://mlanthology.org/wacv/2016/qiu2016wacv-ipdc/) doi:10.1109/WACV.2016.7477626
BibTeX
@inproceedings{qiu2016wacv-ipdc,
title = {{IPDC: Iterative Part-Based Dense Correspondence Between Point Clouds}},
author = {Qiu, Rongqi and Neumann, Ulrich},
booktitle = {IEEE/CVF Winter Conference on Applications of Computer Vision},
year = {2016},
pages = {1-9},
doi = {10.1109/WACV.2016.7477626},
url = {https://mlanthology.org/wacv/2016/qiu2016wacv-ipdc/}
}