Event-Driven Stereo Vision for Fall Detection
Abstract
This paper presents a system concept for efficient real-time fall detection for elderly safety in ambient assisted living applications. Event-driven sensors are biologically inspired: they react autonomously to scene dynamics, generating events upon relative changes in light intensity. Their wide dynamic range and high temporal resolution enable efficient activity monitoring in natural environments. Using a stereo pair of event-driven sensor chips, the scene dynamics can be represented in a 3D volume at high temporal resolution. A person's activity in a home environment can therefore be recorded efficiently, with a low data volume and a high temporal resolution that allows reliable detection of incidents such as falls. In this paper, a dataset of scenarios including 68 falls has been analyzed for real-time detection with an event-driven stereo vision system, and the results are promising.
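The event-generation principle the abstract describes — pixels firing only when the relative (logarithmic) light intensity change exceeds a contrast threshold — can be illustrated with a minimal sketch. The function name, the threshold value, and the frame-difference formulation below are illustrative assumptions, not the sensor's actual circuit behavior, which operates asynchronously per pixel rather than on frame pairs.

```python
import math

def generate_events(intensity_t0, intensity_t1, threshold=0.2):
    """Emit DVS-style (x, y, polarity) events wherever the log-intensity
    change between two snapshots exceeds a contrast threshold.
    Illustrative model only: real event sensors react per pixel,
    asynchronously, with microsecond timestamps."""
    events = []
    for y, (row0, row1) in enumerate(zip(intensity_t0, intensity_t1)):
        for x, (i0, i1) in enumerate(zip(row0, row1)):
            delta = math.log(i1) - math.log(i0)  # relative change
            if abs(delta) >= threshold:
                polarity = 1 if delta > 0 else -1  # ON / OFF event
                events.append((x, y, polarity))
    return events

# Static pixels emit nothing; only the changed pixel fires an event,
# which is why scene dynamics are captured at low data volume.
frame0 = [[100.0, 100.0], [100.0, 100.0]]
frame1 = [[100.0, 100.0], [100.0, 150.0]]
print(generate_events(frame0, frame1))  # → [(1, 1, 1)]
```

Because the log difference measures *relative* change, the same contrast step triggers events under both dim and bright illumination, which is the source of the sensor's wide dynamic range noted in the abstract.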
Cite
Text

Belbachir et al. "Event-Driven Stereo Vision for Fall Detection." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2011. doi:10.1109/CVPRW.2011.5981819

Markdown

[Belbachir et al. "Event-Driven Stereo Vision for Fall Detection." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2011.](https://mlanthology.org/cvprw/2011/belbachir2011cvprw-eventdriven/) doi:10.1109/CVPRW.2011.5981819

BibTeX
@inproceedings{belbachir2011cvprw-eventdriven,
title = {{Event-Driven Stereo Vision for Fall Detection}},
author = {Belbachir, Ahmed Nabil and Schraml, Stephan and Nowakowska, Aneta},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2011},
pages = {78--83},
doi = {10.1109/CVPRW.2011.5981819},
url = {https://mlanthology.org/cvprw/2011/belbachir2011cvprw-eventdriven/}
}