Improved Cooperative Stereo Matching for Dynamic Vision Sensors with Ground Truth Evaluation
Abstract
Event-based vision, as realized by bio-inspired Dynamic Vision Sensors (DVS), is gaining popularity due to its combined advantages of high temporal resolution, wide dynamic range, and power efficiency. Potential applications include surveillance, robotics, and autonomous navigation under uncontrolled environmental conditions. In this paper, we deal with event-based vision for 3D reconstruction of dynamic scene content by using two stationary DVS in a stereo configuration. We focus on a cooperative stereo approach and suggest an improvement over a previously published algorithm that reduces the measured mean error by over 50 percent. An available ground truth data set for stereo event data is utilized to analyze the algorithm's sensitivity to parameter variation and for comparison with competing techniques.
Cite
Text
Piatkowska et al. "Improved Cooperative Stereo Matching for Dynamic Vision Sensors with Ground Truth Evaluation." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2017. doi:10.1109/CVPRW.2017.51
Markdown
[Piatkowska et al. "Improved Cooperative Stereo Matching for Dynamic Vision Sensors with Ground Truth Evaluation." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2017.](https://mlanthology.org/cvprw/2017/piatkowska2017cvprw-improved/) doi:10.1109/CVPRW.2017.51
BibTeX
@inproceedings{piatkowska2017cvprw-improved,
title = {{Improved Cooperative Stereo Matching for Dynamic Vision Sensors with Ground Truth Evaluation}},
author = {Piatkowska, Ewa and Kogler, Jürgen and Belbachir, Ahmed Nabil and Gelautz, Margrit},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2017},
pages = {370-377},
doi = {10.1109/CVPRW.2017.51},
url = {https://mlanthology.org/cvprw/2017/piatkowska2017cvprw-improved/}
}