Best Linear Unbiased Estimation for 2D and 3D Flow with Event-Based Cameras

Abstract

Dynamic Vision Sensors (DVS) provide low-latency, high-dynamic-range motion estimation, but their real-time applicability is often limited by the computational complexity and latency overheads introduced by iterative motion compensation techniques. In this work, we propose a novel probabilistic model that leverages the stochastic distribution of events along moving edges. Using our model, we introduce a lightweight patch-based algorithm that employs a linear combination of event spatial coordinates, making it highly suitable for implementation on specialized hardware. Our approach also scales linearly with dimensionality, making it a good fit for emerging event-based 3D sensors, such as Light-Field DVS (LF-DVS). Experimental results validate the efficiency and scalability of our method, establishing a solid foundation for ultra-efficient, real-time event-based 2D and 3D motion estimation.
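To illustrate the core idea of estimating motion as a linear combination of event spatial coordinates, here is a minimal toy sketch (not the paper's exact model): synthetic events are sampled along a 1D edge moving at constant velocity, with Gaussian positional noise. Under this simple noise model, the best linear unbiased estimator of the velocity reduces to the ordinary least-squares slope, whose weights depend only on the event timestamps. All names and the data-generation setup below are illustrative assumptions.

```python
import numpy as np

def blue_velocity(t, x):
    """Estimate edge velocity as the OLS slope of position x against time t.

    With i.i.d. noise on x, this is a best linear unbiased estimator:
    v_hat = sum_i w_i * x_i, with weights w_i = (t_i - t_bar) / sum_j (t_j - t_bar)^2.
    """
    t = np.asarray(t, dtype=float)
    x = np.asarray(x, dtype=float)
    w = (t - t.mean()) / np.sum((t - t.mean()) ** 2)  # fixed linear weights
    return np.dot(w, x)  # a linear combination of the spatial coordinates

# Synthetic events along a moving edge (illustrative parameters).
rng = np.random.default_rng(0)
v_true, x0, sigma = 3.0, 1.5, 0.05
t = np.sort(rng.uniform(0.0, 1.0, size=200))         # event timestamps
x = x0 + v_true * t + rng.normal(0.0, sigma, 200)    # noisy edge positions
v_hat = blue_velocity(t, x)
```

Because the estimate is a dot product with precomputed weights, each patch's velocity costs one pass over its events, which is the property that makes this style of estimator attractive for specialized hardware.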

Cite

Text

Valerdi and Iturbe. "Best Linear Unbiased Estimation for 2D and 3D Flow with Event-Based Cameras." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2025.

Markdown

[Valerdi and Iturbe. "Best Linear Unbiased Estimation for 2D and 3D Flow with Event-Based Cameras." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2025.](https://mlanthology.org/cvprw/2025/valerdi2025cvprw-best/)

BibTeX

@inproceedings{valerdi2025cvprw-best,
  title     = {{Best Linear Unbiased Estimation for 2D and 3D Flow with Event-Based Cameras}},
  author    = {Valerdi, Juan Luis and Iturbe, Xabier},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2025},
  pages     = {4908--4917},
  url       = {https://mlanthology.org/cvprw/2025/valerdi2025cvprw-best/}
}