Dense View Interpolation on Mobile Devices Using Focal Stacks
Abstract
Light field rendering is a widely used technique for generating images of a scene from novel viewpoints. Interpolative methods for light field rendering require a dense description of the scene in the form of closely spaced images. In this work, we present a simple method for dense view interpolation over general static scenes, using commonly available mobile devices. We capture an approximate focal stack of the scene from adjacent camera locations and interpolate intermediate images by shifting each focal region according to appropriate disparities. We do not rely on focus distance control to capture focal stacks; instead, we describe an automatic method of estimating the focal textures and the blur and disparity parameters required for view interpolation.
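The core interpolation step described above can be sketched as follows. This is a hypothetical simplification, not the authors' implementation: it assumes the focal stack has already been decomposed into per-region in-focus textures with binary masks and a single horizontal disparity per region, and composites an intermediate view by shifting each region by a fraction of its inter-view disparity.

```python
import numpy as np

def interpolate_view(layers, masks, disparities, alpha):
    """Hypothetical sketch of disparity-based view interpolation
    between two adjacent camera positions.

    layers      : list of HxWx3 in-focus textures, ordered far to near
    masks       : list of HxW boolean masks marking each focal region
    disparities : per-region horizontal pixel disparity between the views
    alpha       : interpolation parameter in [0, 1]
    """
    out = np.zeros_like(layers[0], dtype=float)
    covered = np.zeros(layers[0].shape[:2], dtype=bool)
    # Composite back to front so nearer regions overwrite farther ones.
    for tex, mask, d in zip(layers, masks, disparities):
        shift = int(round(alpha * d))          # fractional baseline shift
        tex_s = np.roll(tex, shift, axis=1)
        mask_s = np.roll(mask.astype(bool), shift, axis=1)
        out[mask_s] = tex_s[mask_s]
        covered |= mask_s
    return out, covered
```

In practice the paper estimates the textures, blur, and disparities automatically; holes left uncovered by any shifted region (where `covered` is false) would additionally need inpainting or background fill.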
Cite
Text
Sakurikar and Narayanan. "Dense View Interpolation on Mobile Devices Using Focal Stacks." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2014. doi:10.1109/CVPRW.2014.26
Markdown
[Sakurikar and Narayanan. "Dense View Interpolation on Mobile Devices Using Focal Stacks." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2014.](https://mlanthology.org/cvprw/2014/sakurikar2014cvprw-dense/) doi:10.1109/CVPRW.2014.26
BibTeX
@inproceedings{sakurikar2014cvprw-dense,
title = {{Dense View Interpolation on Mobile Devices Using Focal Stacks}},
author = {Sakurikar, Parikshit and Narayanan, P. J.},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2014},
pages = {138-143},
doi = {10.1109/CVPRW.2014.26},
url = {https://mlanthology.org/cvprw/2014/sakurikar2014cvprw-dense/}
}