Neural Inverse Rendering for High-Accuracy 3D Measurement of Moving Objects with Fewer Phase-Shifting Patterns

Abstract

Among structured-light methods, the phase-shifting approach enables high-resolution, high-accuracy measurement using as few as three patterns. However, its performance degrades significantly when dynamic, complex-shaped objects are measured, because object motion between sequentially projected patterns introduces artifacts and phase inconsistencies. In this study, we propose an enhanced phase-shifting method that incorporates neural inverse rendering to enable 3D measurement of moving objects. To capture object motion effectively, we introduce a displacement field into the rendering model, which represents positional changes accurately and mitigates motion-induced distortions. In addition, to achieve high-precision reconstruction with fewer phase-shifting patterns, we design a multiview rendering framework that uses multiple cameras in conjunction with a single projector. Comparisons with state-of-the-art methods and extensive ablation studies demonstrate that our method accurately reconstructs the shapes of moving objects from a small number of simple, well-known phase-shifting patterns.
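As background to the three-pattern minimum mentioned in the abstract, the classical N-step phase-shifting relation recovers the wrapped phase at each pixel from N equally shifted sinusoidal fringe images; since each pixel has three unknowns (intensity offset, modulation, and phase), N = 3 is the smallest solvable case. The sketch below is a minimal NumPy illustration of this standard relation, not the paper's implementation; the function name and array layout are our own assumptions.

```python
import numpy as np

def wrapped_phase(images: np.ndarray) -> np.ndarray:
    """Wrapped phase from N >= 3 equally shifted fringe images.

    Assumes images[n] = A + B * cos(phi + 2*pi*n/N) per pixel,
    with `images` shaped (N, H, W). Returns phi in [-pi, pi].
    """
    n_steps = images.shape[0]
    deltas = 2.0 * np.pi * np.arange(n_steps) / n_steps
    # Per-pixel least-squares fit of the sinusoid's phase:
    # sum_n I_n sin(delta_n) is proportional to -sin(phi), and
    # sum_n I_n cos(delta_n) is proportional to  cos(phi).
    num = np.tensordot(np.sin(deltas), images, axes=1)
    den = np.tensordot(np.cos(deltas), images, axes=1)
    return np.arctan2(-num, den)
```

Because the N patterns are projected one after another, any object motion between captures violates the constant-phase assumption behind this fit; that is the failure mode the displacement field described in the abstract is designed to address.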

Cite

Text

Urakawa and Watanabe. "Neural Inverse Rendering for High-Accuracy 3D Measurement of Moving Objects with Fewer Phase-Shifting Patterns." International Conference on Computer Vision, 2025.

Markdown

[Urakawa and Watanabe. "Neural Inverse Rendering for High-Accuracy 3D Measurement of Moving Objects with Fewer Phase-Shifting Patterns." International Conference on Computer Vision, 2025.](https://mlanthology.org/iccv/2025/urakawa2025iccv-neural/)

BibTeX

@inproceedings{urakawa2025iccv-neural,
  title     = {{Neural Inverse Rendering for High-Accuracy 3D Measurement of Moving Objects with Fewer Phase-Shifting Patterns}},
  author    = {Urakawa, Yuki and Watanabe, Yoshihiro},
  booktitle = {International Conference on Computer Vision},
  year      = {2025},
  pages     = {27692--27701},
  url       = {https://mlanthology.org/iccv/2025/urakawa2025iccv-neural/}
}