Learning Event-Based Height from Plane and Parallax

Abstract

In this work, we propose a fast method for event-based structure estimation for vehicles traveling in a roughly 2D environment (e.g., one with a ground plane). Our method transfers the classical plane-and-parallax formulation to events: given the homography to a ground plane and the pose of the camera, it warps the events so that the optical flow of events on the ground plane is removed, while events above the ground plane retain a residual flow. We then estimate dense flow in this warped space using a self-supervised neural network, which yields the height of all points in the scene. We evaluate our method on the Multi Vehicle Stereo Event Camera dataset and show its ability to rapidly estimate scene structure both at high speeds and in low-lighting conditions.
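The core warping step described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function name and the NumPy-based event representation (an N×3 array of x, y, t) are assumptions, and the 3×3 ground-plane homography H is taken as given (in the paper it comes from the camera pose and the known plane).

```python
import numpy as np

def warp_events_by_ground_homography(events, H):
    """Warp event pixel coordinates with a ground-plane homography H.

    events: (N, 3) array of (x, y, t). H: 3x3 homography mapping the
    current view onto the reference view via the ground plane.
    After warping, events generated by ground-plane points land at
    (nearly) fixed positions, so their flow is removed; events from
    points above the plane keep a residual parallax displacement
    related to their height.
    """
    # Lift pixel coordinates to homogeneous form (x, y, 1).
    ones = np.ones((events.shape[0], 1))
    xy1 = np.hstack([events[:, :2], ones])
    # Apply the homography and dehomogenize.
    warped = xy1 @ H.T
    warped = warped[:, :2] / warped[:, 2:3]
    # Timestamps are unchanged by the spatial warp.
    return np.hstack([warped, events[:, 2:3]])
```

A downstream network then estimates dense flow in this warped event space; under plane and parallax, the magnitude of the residual flow encodes height above the ground plane.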

Cite

Text

Chaney et al. "Learning Event-Based Height from Plane and Parallax." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2019. doi:10.1109/CVPRW.2019.00206

Markdown

[Chaney et al. "Learning Event-Based Height from Plane and Parallax." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2019.](https://mlanthology.org/cvprw/2019/chaney2019cvprw-learning/) doi:10.1109/CVPRW.2019.00206

BibTeX

@inproceedings{chaney2019cvprw-learning,
  title     = {{Learning Event-Based Height from Plane and Parallax}},
  author    = {Chaney, Kenneth and Zhu, Alex Zihao and Daniilidis, Kostas},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2019},
  pages     = {1634--1637},
  doi       = {10.1109/CVPRW.2019.00206},
  url       = {https://mlanthology.org/cvprw/2019/chaney2019cvprw-learning/}
}