Ambient Occlusion via Compressive Visibility Estimation
Abstract
There has been emerging interest in recovering traditionally challenging intrinsic scene properties. In this paper, we present a novel computational imaging solution for recovering the ambient occlusion (AO) map of an object. AO measures how much light from all directions can reach a surface point without being blocked by self-occlusions. Previous approaches require either obtaining highly accurate surface geometry or acquiring a large number of images. We adopt a compressive sensing framework that captures the object under strategically coded lighting directions. We show that this incident illumination field exhibits unique properties suitable for AO recovery: each ray's contribution to the visibility function is binary, while the rays' distribution for AO measurement is sparse. This enables a sparsity-prior-based solution that iteratively recovers the surface normal, the surface albedo, and the visibility function from a small number of images. To physically implement the scheme, we construct an encodable directional light source using a light field probe. Experiments on synthetic and real scenes show that our approach is both reliable and accurate while requiring significantly fewer input images.
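For reference, ambient occlusion at a surface point $\mathbf{p}$ with normal $\mathbf{n}$ is conventionally defined as cosine-weighted visibility integrated over the hemisphere $\Omega$ (this is the standard graphics formulation; the paper's exact normalization may differ):

$$
AO(\mathbf{p}) = \frac{1}{\pi} \int_{\Omega} V(\mathbf{p}, \omega)\, \max(\mathbf{n} \cdot \omega, 0)\, d\omega,
$$

where $V(\mathbf{p}, \omega) \in \{0, 1\}$ is the binary visibility along direction $\omega$, the quantity whose per-ray binary nature and sparse occlusion pattern the method exploits.

The sketch below illustrates the flavor of sparsity-prior recovery from binary coded lighting at a single pixel. It is a generic $\ell_1$ (ISTA) solver on a toy measurement model, not the authors' algorithm; the joint normal/albedo/visibility estimation, the code design, and all names and parameters here are illustrative assumptions.

```python
import numpy as np

def ista(A, y, lam, n_iters=500):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1.

    A generic sparse-recovery solver, standing in for the paper's
    (more elaborate) sparsity-prior optimization.
    """
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        z = x - A.T @ (A @ x - y) / L        # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return x

# Toy setup (all quantities hypothetical): K images under binary lighting
# codes observe one pixel; its occlusion pattern over M directions is sparse.
rng = np.random.default_rng(0)
M, K = 256, 48                                # sampled directions, coded images
occ = np.zeros(M)
occ[rng.choice(M, size=10, replace=False)] = 1.0   # sparse self-occlusion
C = rng.integers(0, 2, size=(K, M)).astype(float)  # binary lighting codes
y = C @ (1.0 - occ)                           # intensities (unit albedo/cosine)

# Re-express so the sparse occlusion vector is the unknown: C@1 - y = C@occ.
y_occ = C.sum(axis=1) - y
lam = 0.1 * np.max(np.abs(C.T @ y_occ))       # common lasso-path heuristic
occ_hat = ista(C, y_occ, lam)

print("relative residual:",
      np.linalg.norm(C @ occ_hat - y_occ) / np.linalg.norm(y_occ))
print("AO proxy (unweighted mean visibility):", 1.0 - occ_hat.mean())
```

Re-expressing the measurements in terms of the occlusion vector (rather than visibility) is what makes the $\ell_1$ prior applicable: visibility itself is dense for mostly unoccluded points, while its complement is sparse.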
Cite
Text
Yang et al. "Ambient Occlusion via Compressive Visibility Estimation." Conference on Computer Vision and Pattern Recognition, 2015. doi:10.1109/CVPR.2015.7299013
Markdown
[Yang et al. "Ambient Occlusion via Compressive Visibility Estimation." Conference on Computer Vision and Pattern Recognition, 2015.](https://mlanthology.org/cvpr/2015/yang2015cvpr-ambient/) doi:10.1109/CVPR.2015.7299013
BibTeX
@inproceedings{yang2015cvpr-ambient,
title = {{Ambient Occlusion via Compressive Visibility Estimation}},
author = {Yang, Wei and Ji, Yu and Lin, Haiting and Yang, Yang and Kang, Sing Bing and Yu, Jingyi},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2015},
doi = {10.1109/CVPR.2015.7299013},
url = {https://mlanthology.org/cvpr/2015/yang2015cvpr-ambient/}
}