Asynchronous Convolutional Networks for Object Detection in Neuromorphic Cameras

Abstract

Event-based cameras, also known as neuromorphic cameras, are bio-inspired sensors able to perceive changes in the scene at high frequency with low power consumption. Since these devices have become available only recently, a limited amount of work addresses object detection on them. In this paper we propose two neural network architectures for object detection: YOLE, which integrates the events into surfaces and uses a frame-based model to process them, and fcYOLE, an asynchronous event-based fully convolutional network which uses a novel and general formalization of the convolutional and max pooling layers to exploit the sparsity of camera events. We evaluate these architectures on different extensions of publicly available datasets and on a novel synthetic dataset.

Cite

Text

Cannici et al. "Asynchronous Convolutional Networks for Object Detection in Neuromorphic Cameras." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2019. doi:10.1109/CVPRW.2019.00209

Markdown

[Cannici et al. "Asynchronous Convolutional Networks for Object Detection in Neuromorphic Cameras." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2019.](https://mlanthology.org/cvprw/2019/cannici2019cvprw-asynchronous/) doi:10.1109/CVPRW.2019.00209

BibTeX

@inproceedings{cannici2019cvprw-asynchronous,
  title     = {{Asynchronous Convolutional Networks for Object Detection in Neuromorphic Cameras}},
  author    = {Cannici, Marco and Ciccone, Marco and Romanoni, Andrea and Matteucci, Matteo},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2019},
  pages     = {1656--1665},
  doi       = {10.1109/CVPRW.2019.00209},
  url       = {https://mlanthology.org/cvprw/2019/cannici2019cvprw-asynchronous/}
}