Spatially-Varying Outdoor Lighting Estimation from Intrinsics

Abstract

We present SOLID-Net, a neural network for spatially-varying outdoor lighting estimation from a single outdoor image at any 2D pixel location. Previous work has used a unified sky environment map to represent outdoor lighting. Instead, we generate spatially-varying local lighting environment maps by combining a global sky environment map with warped image information according to geometric information estimated from intrinsics. As no outdoor dataset with images and local lighting ground truth is readily available, we introduce the SOLID-Img dataset, containing physically-based rendered images with their corresponding intrinsic and lighting information. We train a deep neural network to regress intrinsic cues under physically-based constraints and use them to estimate both global and local lighting. Experiments on both synthetic and real datasets show that SOLID-Net significantly outperforms previous methods.
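The compositional idea in the abstract, where sky pixels of a local environment map come from the global sky map and occluded pixels come from geometry-warped image content, can be sketched as follows. This is a minimal illustration only; the function name, array shapes, and blending rule are assumptions, not the paper's actual API.

```python
import numpy as np

def compose_local_envmap(sky_env, warped_local, sky_mask):
    """Blend a global sky environment map with warped image content
    to form a per-location local lighting environment map (illustrative sketch).

    sky_env:      (H, W, 3) global sky environment map (panoramic frame)
    warped_local: (H, W, 3) scene appearance warped into the same panoramic
                  frame using estimated geometry
    sky_mask:     (H, W) 1.0 where the panorama pixel sees sky,
                  0.0 where local geometry occludes the sky
    """
    mask = sky_mask[..., None].astype(sky_env.dtype)
    # Sky-visible pixels take the global map; occluded pixels take warped content.
    return mask * sky_env + (1.0 - mask) * warped_local

# Toy usage: a 2x2 panorama, top row sees sky, bottom row is occluded.
sky = np.ones((2, 2, 3)) * 0.8          # bright sky radiance
local = np.ones((2, 2, 3)) * 0.2        # darker warped scene content
mask = np.array([[1.0, 1.0], [0.0, 0.0]])
env = compose_local_envmap(sky, local, mask)
```

In the paper itself this combination is learned rather than a fixed mask blend; the sketch only conveys the two information sources being fused.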

Cite

Text

Zhu et al. "Spatially-Varying Outdoor Lighting Estimation from Intrinsics." Conference on Computer Vision and Pattern Recognition, 2021. doi:10.1109/CVPR46437.2021.01264

Markdown

[Zhu et al. "Spatially-Varying Outdoor Lighting Estimation from Intrinsics." Conference on Computer Vision and Pattern Recognition, 2021.](https://mlanthology.org/cvpr/2021/zhu2021cvpr-spatiallyvarying/) doi:10.1109/CVPR46437.2021.01264

BibTeX

@inproceedings{zhu2021cvpr-spatiallyvarying,
  title     = {{Spatially-Varying Outdoor Lighting Estimation from Intrinsics}},
  author    = {Zhu, Yongjie and Zhang, Yinda and Li, Si and Shi, Boxin},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2021},
  pages     = {12834--12842},
  doi       = {10.1109/CVPR46437.2021.01264},
  url       = {https://mlanthology.org/cvpr/2021/zhu2021cvpr-spatiallyvarying/}
}