LiDAR-Event Stereo Fusion with Hallucinations

Abstract

Event stereo matching is an emerging technique for estimating depth from neuromorphic cameras; however, events are unlikely to trigger in the absence of motion or in the presence of large, untextured regions, making the correspondence problem extremely challenging. To overcome these limitations, we propose integrating a stereo event camera with a fixed-frequency active sensor – e.g., a LiDAR – collecting sparse depth measurements. Such depth hints are used to hallucinate – i.e., insert fictitious events into – the stacks or raw input streams, compensating for the lack of information in the absence of brightness changes. Our techniques are general, can be adapted to any structured representation used to stack events, and outperform state-of-the-art fusion methods applied to event-based stereo.
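To make the hallucination idea concrete, the sketch below illustrates one way sparse LiDAR depth could be turned into fictitious events written into a pair of stacked event representations. It is a minimal illustration, not the authors' implementation: the function name `hallucinate_stacks`, the constant-magnitude pattern, and the depth-to-disparity conversion via focal length and baseline are assumptions made for the example.

import numpy as np

def hallucinate_stacks(left_stack, right_stack, sparse_depth, focal, baseline,
                       magnitude=1.0):
    """Illustrative sketch of depth-guided event hallucination (not the paper's code).

    left_stack / right_stack: (C, H, W) structured event representations.
    sparse_depth: (H, W) LiDAR depth map, 0 where no measurement is available.
    focal, baseline: stereo calibration used to convert depth to disparity.
    magnitude: value of the fictitious pattern written into the stacks (assumed).
    """
    ys, xs = np.nonzero(sparse_depth)                 # pixels with a depth hint
    disparity = focal * baseline / sparse_depth[ys, xs]
    xs_right = np.round(xs - disparity).astype(int)   # corresponding right-view column
    valid = (xs_right >= 0) & (xs_right < left_stack.shape[2])

    # Write the same fictitious pattern at corresponding coordinates in both views,
    # giving the stereo matcher distinctive features where no events were triggered.
    left_stack[:, ys[valid], xs[valid]] = magnitude
    right_stack[:, ys[valid], xs_right[valid]] = magnitude
    return left_stack, right_stack

In practice the same principle can be applied earlier in the pipeline, injecting fictitious events into the raw streams before stacking, so that any structured representation built afterwards inherits the hallucinated correspondences.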

Cite

Text

Bartolomei et al. "LiDAR-Event Stereo Fusion with Hallucinations." Proceedings of the European Conference on Computer Vision (ECCV), 2024. doi:10.1007/978-3-031-72658-3_8

Markdown

[Bartolomei et al. "LiDAR-Event Stereo Fusion with Hallucinations." Proceedings of the European Conference on Computer Vision (ECCV), 2024.](https://mlanthology.org/eccv/2024/bartolomei2024eccv-lidarevent/) doi:10.1007/978-3-031-72658-3_8

BibTeX

@inproceedings{bartolomei2024eccv-lidarevent,
  title     = {{LiDAR-Event Stereo Fusion with Hallucinations}},
  author    = {Bartolomei, Luca and Poggi, Matteo and Conti, Andrea and Mattoccia, Stefano},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  year      = {2024},
  doi       = {10.1007/978-3-031-72658-3_8},
  url       = {https://mlanthology.org/eccv/2024/bartolomei2024eccv-lidarevent/}
}