Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera
Abstract
We propose a method that performs real-time 3D reconstruction from a single hand-held event camera, with no additional sensing, in unstructured scenes of which it has no prior knowledge. It is based on three decoupled probabilistic filters, each estimating one of 6-DoF camera motion, scene logarithmic (log) intensity gradient, and scene inverse depth relative to a keyframe, and we build a real-time graph of these to track and model over an extended local workspace. We also upgrade the gradient estimate for each keyframe into an intensity image, allowing us to recover a real-time, video-like intensity sequence with spatial and temporal super-resolution from the low bit-rate input event stream. To the best of our knowledge, this is the first algorithm provably able to track general 6-DoF motion while reconstructing arbitrary structure, including its intensity, and to recover grayscale video, relying exclusively on event camera data.
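The abstract describes decoupled probabilistic filters that fuse a stream of asynchronous events into per-pixel estimates. As a rough illustration only (not the authors' actual filter equations), the sketch below shows a typical event record and a generic scalar Kalman-style fusion step of the kind such per-pixel filters build on; the `Event` fields and `kalman_update` helper are illustrative assumptions.

```python
from dataclasses import dataclass

# An event camera reports asynchronous per-pixel brightness changes
# rather than full frames. A single event typically carries a pixel
# location, a timestamp, and a polarity (sign of the log-intensity
# change). Field names here are illustrative, not the paper's.
@dataclass(frozen=True)
class Event:
    x: int
    y: int
    t: float        # timestamp in seconds
    polarity: int   # +1 (brightness up) or -1 (brightness down)

def kalman_update(mean: float, var: float, z: float, r: float):
    """Fuse one scalar measurement z (noise variance r) into a
    Gaussian estimate (mean, var); returns the posterior pair."""
    k = var / (var + r)                      # Kalman gain
    return mean + k * (z - mean), (1.0 - k) * var

# Toy usage: refine one per-pixel quantity with two noisy measurements.
mean, var = 0.0, 1.0                         # vague prior
for z in (0.8, 1.0):
    mean, var = kalman_update(mean, var, z, 0.5)
```

Each fused measurement shrinks the posterior variance, which is how a per-pixel estimate (e.g. of log-intensity gradient or inverse depth) sharpens as more events arrive.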
Cite
Text
Kim et al. "Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera." European Conference on Computer Vision, 2016. doi:10.1007/978-3-319-46466-4_21
Markdown
[Kim et al. "Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera." European Conference on Computer Vision, 2016.](https://mlanthology.org/eccv/2016/kim2016eccv-real/) doi:10.1007/978-3-319-46466-4_21
BibTeX
@inproceedings{kim2016eccv-real,
title = {{Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera}},
author = {Kim, Hanme and Leutenegger, Stefan and Davison, Andrew J.},
booktitle = {European Conference on Computer Vision},
year = {2016},
pages = {349-364},
doi = {10.1007/978-3-319-46466-4_21},
url = {https://mlanthology.org/eccv/2016/kim2016eccv-real/}
}