BlinkTrack: Feature Tracking over 80 FPS via Events and Images
Abstract
Event cameras, known for their high temporal resolution and ability to capture asynchronous changes, have gained significant attention for their potential in feature tracking, especially in challenging conditions. However, event cameras lack the fine-grained texture information that conventional cameras provide, leading to error accumulation in tracking. To address this, we propose a novel framework, BlinkTrack, which integrates event data with grayscale images for high-frequency feature tracking. Our method extends the traditional Kalman filter into a learning-based framework, utilizing differentiable Kalman filters in both the event and image branches. This approach improves single-modality tracking and effectively handles data association and fusion of asynchronous event and image data. We also introduce new synthetic and augmented datasets to better evaluate our model. Experimental results indicate that BlinkTrack significantly outperforms existing methods, exceeding 80 FPS with multi-modality data and 100 FPS with preprocessed event data. Code and datasets are available at https://github.com/ColieShen/BlinkTrack.
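The abstract describes fusing asynchronous event and image measurements through a differentiable Kalman filter. The paper's actual network architecture and learned noise models are not reproduced here; the following is only a minimal, illustrative sketch of the underlying idea, where measurements from two branches arrive at different timestamps and each applies a standard, fully differentiable Kalman update. All function names, matrices, and values below are hypothetical placeholders, not BlinkTrack's implementation.

```python
import torch

def kf_predict(x, P, F, Q):
    """Prediction step: propagate the state estimate and its covariance."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def kf_update(x, P, z, H, R):
    """Measurement update; every operation is differentiable, so gradients can
    flow back into whatever network produces the measurement z and noise R."""
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ torch.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (torch.eye(P.shape[0]) - K @ H) @ P
    return x, P

# Toy asynchronous fusion: a 2D feature position under a constant-position model.
# "Event" and "image" measurements arrive at different timestamps, each with its
# own (possibly learned) noise covariance, and update the same filter state.
x = torch.zeros(2)                         # feature position estimate
P = torch.eye(2)                           # estimate covariance
F, Q, H = torch.eye(2), 0.01 * torch.eye(2), torch.eye(2)

measurements = [                           # (timestamp, position, noise) - dummy values
    (0.01, torch.tensor([1.0, 1.1]), 0.10 * torch.eye(2)),  # event-branch measurement
    (0.04, torch.tensor([1.2, 1.0]), 0.02 * torch.eye(2)),  # image-branch measurement
]
for t, z, R in measurements:
    x, P = kf_predict(x, P, F, Q)
    x, P = kf_update(x, P, z, H, R)
print(x)
```

Because the update is written entirely in differentiable tensor operations, it can sit inside a training loop and be supervised end to end, which is the premise of the learning-based Kalman formulation the abstract refers to.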
Cite
Text
Shen et al. "BlinkTrack: Feature Tracking over 80 FPS via Events and Images." International Conference on Computer Vision, 2025.
Markdown
[Shen et al. "BlinkTrack: Feature Tracking over 80 FPS via Events and Images." International Conference on Computer Vision, 2025.](https://mlanthology.org/iccv/2025/shen2025iccv-blinktrack/)
BibTeX
@inproceedings{shen2025iccv-blinktrack,
title = {{BlinkTrack: Feature Tracking over 80 FPS via Events and Images}},
author = {Shen, Yichen and Li, Yijin and Chen, Shuo and Li, Guanglin and Huang, Zhaoyang and Bao, Hujun and Cui, Zhaopeng and Zhang, Guofeng},
booktitle = {International Conference on Computer Vision},
year = {2025},
pages = {9298--9308},
url = {https://mlanthology.org/iccv/2025/shen2025iccv-blinktrack/}
}