Learning Long-Range Spatial Dependencies with Horizontal Gated Recurrent Units
Abstract
Progress in deep learning has spawned great successes in many engineering applications. As a prime example, convolutional neural networks, a type of feedforward neural network, are now approaching -- and sometimes even surpassing -- human accuracy on a variety of visual recognition tasks. Here, however, we show that these neural networks and their recent extensions struggle in recognition tasks where co-dependent visual features must be detected over long spatial ranges. We introduce a visual challenge, Pathfinder, and describe a novel recurrent neural network architecture called the horizontal gated recurrent unit (hGRU) to learn intrinsic horizontal connections -- both within and across feature columns. We demonstrate that a single hGRU layer matches or outperforms all tested feedforward hierarchical baselines, including state-of-the-art architectures with orders of magnitude more parameters.
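To make the idea of learned horizontal connections concrete, the snippet below is a minimal sketch (assuming PyTorch) of a convolutional GRU-style layer whose hidden state is updated over several timesteps through a spatial convolution shared across steps, i.e., lateral connections within and across feature maps. The class name HorizontalConvGRU, the kernel size, and the number of timesteps are illustrative assumptions; this is a simplification for intuition, not the authors' exact hGRU formulation (which separates excitatory and inhibitory horizontal interactions and uses additional gain and mix gates).

# Minimal, illustrative sketch only -- not the published hGRU implementation.
import torch
import torch.nn as nn

class HorizontalConvGRU(nn.Module):
    def __init__(self, channels, kernel_size=7, timesteps=8):
        super().__init__()
        self.timesteps = timesteps
        # Reset and update gates computed from the feedforward drive and the hidden state.
        self.gate_conv = nn.Conv2d(2 * channels, 2 * channels, kernel_size=1)
        # Horizontal connections: a spatial convolution over the hidden state,
        # shared across timesteps, linking units within and across feature maps.
        self.horizontal = nn.Conv2d(channels, channels, kernel_size,
                                    padding=kernel_size // 2)
        self.candidate = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, x):
        h = torch.zeros_like(x)
        for _ in range(self.timesteps):
            gates = torch.sigmoid(self.gate_conv(torch.cat([x, h], dim=1)))
            reset, update = gates.chunk(2, dim=1)
            # Lateral influence from neighboring hidden units, modulated by the reset gate.
            lateral = self.horizontal(reset * h)
            h_tilde = torch.tanh(self.candidate(torch.cat([x, lateral], dim=1)))
            h = (1 - update) * h + update * h_tilde
        return h

# Usage: apply the recurrent layer to feature maps from a single convolutional layer
# (shapes here are arbitrary placeholders, e.g. a 150x150 Pathfinder-style input).
feats = torch.randn(1, 25, 150, 150)
out = HorizontalConvGRU(channels=25)(feats)
print(out.shape)  # torch.Size([1, 25, 150, 150])

The recurrence lets distant image locations influence one another after several settling steps, which is the intuition behind using a single recurrent layer in place of a deep feedforward stack for tasks like Pathfinder.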
Cite
Text
Linsley et al. "Learning Long-Range Spatial Dependencies with Horizontal Gated Recurrent Units." Neural Information Processing Systems, 2018.Markdown
[Linsley et al. "Learning Long-Range Spatial Dependencies with Horizontal Gated Recurrent Units." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/linsley2018neurips-learning/)BibTeX
@inproceedings{linsley2018neurips-learning,
title = {{Learning Long-Range Spatial Dependencies with Horizontal Gated Recurrent Units}},
author = {Linsley, Drew and Kim, Junkyung and Veerabadran, Vijay and Windolf, Charles and Serre, Thomas},
booktitle = {Neural Information Processing Systems},
year = {2018},
pages = {152-164},
url = {https://mlanthology.org/neurips/2018/linsley2018neurips-learning/}
}