Polarization-Based Inverse Rendering from a Single View

Abstract

This paper presents a method to estimate geometrical, photometrical, and environmental information of a single-viewed object in one integrated framework, under a fixed viewing position and a fixed illumination direction. These three types of information are important to render a photorealistic image of a real object. Photometrical information represents the texture and the surface roughness of an object, while geometrical and environmental information represent the 3D shape of an object and the illumination distribution, respectively. The proposed method estimates the 3D shape by computing the surface normal from polarization data, calculates the texture of the object from the diffuse-only reflection component, determines the illumination directions from the position of the brightest intensity in the specular reflection component, and finally computes the surface roughness of the object by using the estimated illumination distribution.
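Two of the steps summarized above can be sketched in code: separating the polarized and unpolarized parts of the reflection with a rotating linear polarizer, and recovering the surface zenith angle from the degree of polarization (DOP). This is a minimal illustration, not the authors' implementation; the Fresnel-based DOP formula for diffuse reflection and the refractive index n = 1.5 are standard assumptions, and all function names are introduced here for illustration.

```python
import numpy as np

def fit_polarizer_sinusoid(intensities, angles):
    """Intensity behind a linear polarizer at angle phi varies as
    I(phi) = (Imax + Imin)/2 + ((Imax - Imin)/2) * cos(2*(phi - phase)).
    Fit this sinusoid by linear least squares; return (Imin, Imax, phase).
    The unpolarized (diffuse-dominated) part is ~2*Imin, the polarized
    part is Imax - Imin."""
    A = np.stack([np.ones_like(angles),
                  np.cos(2 * angles),
                  np.sin(2 * angles)], axis=1)
    c, a, b = np.linalg.lstsq(A, intensities, rcond=None)[0]
    amp = np.hypot(a, b)
    phase = 0.5 * np.arctan2(b, a)
    return c - amp, c + amp, phase

def diffuse_dop(theta, n=1.5):
    """Degree of polarization of diffuse reflection at zenith angle
    theta, from the Fresnel transmission model (assumed n = 1.5)."""
    s = np.sin(theta)
    return ((n - 1 / n) ** 2 * s ** 2) / (
        2 + 2 * n ** 2 - (n + 1 / n) ** 2 * s ** 2
        + 4 * np.cos(theta) * np.sqrt(n ** 2 - s ** 2))

def zenith_from_dop(rho, n=1.5):
    """Invert diffuse_dop by table lookup; it is monotonically
    increasing on [0, pi/2) for diffuse reflection."""
    thetas = np.linspace(0.0, np.pi / 2 - 1e-3, 10000)
    return np.interp(rho, diffuse_dop(thetas, n), thetas)

# Usage: synthesize polarizer measurements for a known zenith angle,
# then recover the DOP and the zenith angle from them.
theta_true = np.deg2rad(40.0)
rho = diffuse_dop(theta_true)
angles = np.deg2rad(np.arange(0.0, 180.0, 10.0))
intensities = 1.0 * (1 + rho * np.cos(2 * (angles - 0.3)))
i_min, i_max, _ = fit_polarizer_sinusoid(intensities, angles)
rho_est = (i_max - i_min) / (i_max + i_min)
theta_est = zenith_from_dop(rho_est)
```

The surface normal follows from the zenith angle together with the azimuth, which the phase of the fitted sinusoid constrains up to a 180-degree ambiguity.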

Cite

Text

Miyazaki et al. "Polarization-Based Inverse Rendering from a Single View." IEEE/CVF International Conference on Computer Vision, 2003. doi:10.1109/ICCV.2003.1238455

Markdown

[Miyazaki et al. "Polarization-Based Inverse Rendering from a Single View." IEEE/CVF International Conference on Computer Vision, 2003.](https://mlanthology.org/iccv/2003/miyazaki2003iccv-polarization-a/) doi:10.1109/ICCV.2003.1238455

BibTeX

@inproceedings{miyazaki2003iccv-polarization-a,
  title     = {{Polarization-Based Inverse Rendering from a Single View}},
  author    = {Miyazaki, Daisuke and Tan, Robby T. and Hara, Kenji and Ikeuchi, Katsushi},
  booktitle = {IEEE/CVF International Conference on Computer Vision},
  year      = {2003},
  pages     = {982-987},
  doi       = {10.1109/ICCV.2003.1238455},
  url       = {https://mlanthology.org/iccv/2003/miyazaki2003iccv-polarization-a/}
}