How to Calibrate Your Event Camera

Abstract

We propose a generic event camera calibration framework using image reconstruction. Instead of relying on blinking LED patterns or external screens, we show that neural-network–based image reconstruction is well suited for the task of intrinsic and extrinsic calibration of event cameras. The advantage of our proposed approach is that we can use standard calibration patterns that do not rely on active illumination. Furthermore, our approach enables extrinsic calibration between frame-based and event-based sensors without additional complexity. Both simulation and real-world experiments indicate that calibration through image reconstruction is accurate under common distortion models and a wide variety of distortion parameters.

Cite

Text

Muglikar et al. "How to Calibrate Your Event Camera." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2021. doi:10.1109/CVPRW53098.2021.00155

Markdown

[Muglikar et al. "How to Calibrate Your Event Camera." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2021.](https://mlanthology.org/cvprw/2021/muglikar2021cvprw-calibrate/) doi:10.1109/CVPRW53098.2021.00155

BibTeX

@inproceedings{muglikar2021cvprw-calibrate,
  title     = {{How to Calibrate Your Event Camera}},
  author    = {Muglikar, Manasi and Gehrig, Mathias and Gehrig, Daniel and Scaramuzza, Davide},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2021},
  pages     = {1403--1409},
  doi       = {10.1109/CVPRW53098.2021.00155},
  url       = {https://mlanthology.org/cvprw/2021/muglikar2021cvprw-calibrate/}
}