SkipW: Resource Adaptable RNN with Strict Upper Computational Limit

Abstract

We introduce Skip-Window, a method that allows recurrent neural networks (RNNs) to trade accuracy for computational cost while analyzing a sequence. Like existing approaches, Skip-Window extends an RNN cell with a mechanism that encourages the model to process fewer inputs. Unlike them, Skip-Window respects a strict computational budget, which makes it more suitable for limited hardware. We evaluate this approach on two datasets: a human activity recognition task and the adding task. Our results show that Skip-Window exceeds the accuracy of existing approaches at a lower computational cost while strictly capping that cost.
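To make the core idea concrete, the following is a minimal PyTorch sketch of the general pattern the abstract describes: an RNN wrapper that scores the inputs in each window and updates the cell only on the top-scoring ones, so the number of updates per window never exceeds a hard cap. This is an illustration of the skip-under-a-budget idea, not the paper's actual architecture; the scorer, window_size, and budget names are assumptions introduced for this example.

import torch
import torch.nn as nn

class BudgetedSkipRNN(nn.Module):
    """Hypothetical sketch: GRU that processes at most `budget` inputs
    per window of `window_size` timesteps (not the authors' code)."""

    def __init__(self, input_size, hidden_size, window_size=10, budget=5):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        # Scores how useful each timestep in a window is to process.
        self.scorer = nn.Linear(input_size, 1)
        self.window_size = window_size
        self.budget = budget  # strict upper limit on cell updates per window

    def forward(self, x):
        # x: (batch, seq_len, input_size)
        batch, seq_len, _ = x.shape
        h = x.new_zeros(batch, self.cell.hidden_size)
        outputs = []
        for start in range(0, seq_len, self.window_size):
            window = x[:, start:start + self.window_size]     # (B, W, D)
            scores = self.scorer(window).squeeze(-1)          # (B, W)
            k = min(self.budget, window.size(1))
            # Keep only the top-k timesteps, in temporal order:
            # the budget can never be exceeded.
            keep = scores.topk(k, dim=1).indices.sort(dim=1).values
            for t in range(k):
                idx = keep[:, t]                              # (B,)
                step = window[torch.arange(batch), idx]       # (B, D)
                h = self.cell(step, h)
                outputs.append(h)
        return torch.stack(outputs, dim=1), h

Because at most `budget` cell updates run per window, the total compute has a strict upper bound regardless of sequence content, which is the property that distinguishes a hard budget from the soft skipping penalties used by earlier skip-RNN methods.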

Cite

Text

Mayet et al. "SkipW: Resource Adaptable RNN with Strict Upper Computational Limit." International Conference on Learning Representations, 2021.

Markdown

[Mayet et al. "SkipW: Resource Adaptable RNN with Strict Upper Computational Limit." International Conference on Learning Representations, 2021.](https://mlanthology.org/iclr/2021/mayet2021iclr-skipw/)

BibTeX

@inproceedings{mayet2021iclr-skipw,
  title     = {{SkipW: Resource Adaptable RNN with Strict Upper Computational Limit}},
  author    = {Mayet, Tsiry and Lambert, Anne and Leguyadec, Pascal and Le Bolzer, Françoise and Schnitzler, François},
  booktitle = {International Conference on Learning Representations},
  year      = {2021},
  url       = {https://mlanthology.org/iclr/2021/mayet2021iclr-skipw/}
}