A Lightweight Spatiotemporal Network for Online Eye Tracking with Event Camera
Abstract
Event-based data are commonly encountered in edge computing environments where efficiency and low latency are critical. To interface with such data and leverage their rich temporal features, we propose a causal spatiotemporal convolutional network. This solution targets efficient implementation on edge-appropriate hardware with limited resources in three ways: 1) it deliberately targets a simple architecture and set of operations (convolutions, ReLU activations); 2) it can be configured to perform online inference efficiently via buffering of layer outputs; 3) it can achieve more than 90% activation sparsity through regularization during training, enabling very significant efficiency gains on event-based processors. In addition, we propose a general affine augmentation strategy acting directly on the events, which alleviates the problem of dataset scarcity for event-based systems. We apply our model to the AIS 2024 event-based eye-tracking challenge, reaching a score of 0.9916 p10 accuracy on the Kaggle private test set.
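The following is a minimal sketch (in PyTorch) of the kind of buffered causal temporal convolution the abstract describes: during online inference, each layer keeps a small FIFO of its most recent inputs so a new time step can be processed without recomputing the whole temporal window. The class and parameter names (`BufferedCausalConv1d`, `kernel_size`, etc.) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of online inference via buffering of layer outputs.
import torch
import torch.nn as nn


class BufferedCausalConv1d(nn.Module):
    """Causal temporal convolution with an input FIFO for streaming inference."""

    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        self.kernel_size = kernel_size
        # No built-in padding: causality is handled by the buffer below.
        self.conv = nn.Conv1d(channels, channels, kernel_size)
        self.relu = nn.ReLU()
        self.buffer = None  # holds the last (kernel_size - 1) input frames

    def reset(self, batch: int, channels: int, device=None):
        self.buffer = torch.zeros(batch, channels, self.kernel_size - 1, device=device)

    def forward(self, x_t: torch.Tensor) -> torch.Tensor:
        # x_t: (batch, channels, 1), a single new time step.
        if self.buffer is None:
            self.reset(x_t.shape[0], x_t.shape[1], x_t.device)
        window = torch.cat([self.buffer, x_t], dim=-1)  # (B, C, kernel_size)
        self.buffer = window[..., 1:].detach()          # slide the FIFO forward
        return self.relu(self.conv(window))             # (B, C, 1)


# Streaming usage: feed one time step at a time; outputs match an offline,
# left-padded causal convolution applied to the full sequence.
layer = BufferedCausalConv1d(channels=8)
for _ in range(5):
    y_t = layer(torch.randn(1, 8, 1))
```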
Cite
Text
Pei et al. "A Lightweight Spatiotemporal Network for Online Eye Tracking with Event Camera." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2024. doi:10.1109/CVPRW63382.2024.00587
Markdown
[Pei et al. "A Lightweight Spatiotemporal Network for Online Eye Tracking with Event Camera." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2024.](https://mlanthology.org/cvprw/2024/pei2024cvprw-lightweight/) doi:10.1109/CVPRW63382.2024.00587
BibTeX
@inproceedings{pei2024cvprw-lightweight,
title = {{A Lightweight Spatiotemporal Network for Online Eye Tracking with Event Camera}},
author = {Pei, Yan Ru and Brüers, Sasskia and Crouzet, Sébastien M. and McLelland, Douglas and Coenen, Olivier},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2024},
pages = {5780-5788},
doi = {10.1109/CVPRW63382.2024.00587},
url = {https://mlanthology.org/cvprw/2024/pei2024cvprw-lightweight/}
}