Event-Aided Direct Sparse Odometry
Abstract
We introduce EDS, a direct monocular visual odometry method that uses events and frames. Our algorithm leverages the event generation model to track the camera motion in the blind time between frames. The method formulates a direct probabilistic approach based on observed brightness increments. Per-pixel brightness increments are predicted from a sparse set of selected 3D points and are compared to the events via the brightness increment error to estimate camera motion. The method recovers a semi-dense 3D map using photometric bundle adjustment. EDS is the first method to perform 6-DOF VO using events and frames with a direct approach. By design, it overcomes the problem of changing appearance that affects indirect methods. Our results outperform all previous event-based odometry solutions. We also show that, for a target error performance, EDS can work at lower frame rates than state-of-the-art frame-based VO solutions. This opens the door to low-power motion-tracking applications where frames are sparingly triggered "on demand" and our method tracks the motion in between. We release code and datasets to the public.
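As context for the abstract, the following is a minimal sketch of the event generation model and the brightness increment error it refers to; the notation ($L$ for log brightness, $C$ for the contrast sensitivity, $\mathbf{v}$ for the motion field) is generic and does not necessarily match the paper's exact formulation. An event $e_k = (\mathbf{u}_k, t_k, p_k)$ is fired when the log brightness at pixel $\mathbf{u}_k$ changes by the contrast sensitivity:

$$\Delta L(\mathbf{u}_k, t_k) \doteq L(\mathbf{u}_k, t_k) - L(\mathbf{u}_k, t_k - \Delta t_k) = p_k\, C, \qquad p_k \in \{+1, -1\}.$$

Accumulating events over a short time window gives the observed per-pixel brightness increment $\Delta L(\mathbf{u}) = C \sum_{k:\,\mathbf{u}_k = \mathbf{u}} p_k$, while the increment predicted from the selected 3D points follows from linearized brightness constancy, $\Delta \hat{L}(\mathbf{u}) \approx -\nabla L(\mathbf{u}) \cdot \mathbf{v}(\mathbf{u})\, \Delta t$, where the motion field $\mathbf{v}(\mathbf{u})$ depends on the camera motion and each point's depth. The camera pose is then estimated by minimizing the brightness increment error

$$E = \sum_{\mathbf{u}} \big( \Delta L(\mathbf{u}) - \Delta \hat{L}(\mathbf{u}) \big)^2$$

over the motion parameters.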
Cite
Text
Hidalgo-Carrió et al. "Event-Aided Direct Sparse Odometry." Conference on Computer Vision and Pattern Recognition, 2022. doi:10.1109/CVPR52688.2022.00569
Markdown
[Hidalgo-Carrió et al. "Event-Aided Direct Sparse Odometry." Conference on Computer Vision and Pattern Recognition, 2022.](https://mlanthology.org/cvpr/2022/hidalgocarrio2022cvpr-eventaided/) doi:10.1109/CVPR52688.2022.00569
BibTeX
@inproceedings{hidalgocarrio2022cvpr-eventaided,
  title     = {{Event-Aided Direct Sparse Odometry}},
  author    = {Hidalgo-Carrió, Javier and Gallego, Guillermo and Scaramuzza, Davide},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2022},
  pages     = {5781-5790},
  doi       = {10.1109/CVPR52688.2022.00569},
  url       = {https://mlanthology.org/cvpr/2022/hidalgocarrio2022cvpr-eventaided/}
}