CNNs Efficiently Learn Long-Range Dependencies

Abstract

The role of feedback (or recurrent) connections is a fundamental question in neuroscience and machine learning. Recently, two benchmarks that require following paths in images have been proposed [1,2], and recurrence was considered helpful for solving them efficiently. In this work, we demonstrate that these tasks can be solved equally well, or even better, by a single, efficient feed-forward convolutional neural network architecture. We analyze ResNet training with respect to model complexity and sample efficiency and show that, on both benchmarks, a narrow, parameter-efficient ResNet performs on par with the recurrent and computationally more complex hCNN and td+hCNN models from previous work. Code: https://eckerlab.org/code/cnn-efficient-path-tracing
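
The abstract does not specify the exact architecture, but the sketch below (PyTorch) illustrates what a narrow, parameter-efficient ResNet for a path-following benchmark might look like. The channel width, depth, input size, and binary readout are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Basic residual block: conv-BN-ReLU-conv-BN plus identity skip."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # skip connection


class NarrowResNet(nn.Module):
    """Narrow ResNet: few channels per layer, enough depth for a large receptive field."""

    def __init__(self, in_channels: int = 1, width: int = 16, num_blocks: int = 8):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(in_channels, width, kernel_size=7, stride=2, padding=3, bias=False),
            nn.BatchNorm2d(width),
            nn.ReLU(inplace=True),
        )
        self.blocks = nn.Sequential(*[ResidualBlock(width) for _ in range(num_blocks)])
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(width, 1),  # binary decision, e.g. "are the two markers connected by a path?"
        )

    def forward(self, x):
        return self.head(self.blocks(self.stem(x)))


if __name__ == "__main__":
    model = NarrowResNet()
    print(sum(p.numel() for p in model.parameters()))  # parameter count stays small
    logits = model(torch.randn(2, 1, 150, 150))        # hypothetical grayscale path-tracing input
    print(logits.shape)                                 # -> torch.Size([2, 1])
```

Keeping the width small keeps the parameter count low, while stacking residual blocks grows the receptive field, which is one plausible way a feed-forward network can capture the long-range dependencies these path-following tasks require.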

Cite

Text

Lüddecke and Ecker. "CNNs Efficiently Learn Long-Range Dependencies." NeurIPS 2020 Workshops: SVRHM, 2020.

Markdown

[Lüddecke and Ecker. "CNNs Efficiently Learn Long-Range Dependencies." NeurIPS 2020 Workshops: SVRHM, 2020.](https://mlanthology.org/neuripsw/2020/luddecke2020neuripsw-cnns/)

BibTeX

@inproceedings{luddecke2020neuripsw-cnns,
  title     = {{CNNs Efficiently Learn Long-Range Dependencies}},
  author    = {Lüddecke, Timo and Ecker, Alexander S},
  booktitle = {NeurIPS 2020 Workshops: SVRHM},
  year      = {2020},
  url       = {https://mlanthology.org/neuripsw/2020/luddecke2020neuripsw-cnns/}
}