Time Adaptive Recurrent Neural Network

Abstract

We propose a learning method that dynamically modifies the time constants of the continuous-time counterpart of a vanilla RNN. The time constants are modified based on the current observation and the hidden state. Our proposal addresses the well-known trainability issues of RNNs: it mitigates the exploding and vanishing gradient phenomena by placing novel constraints on the parameter space, and it suppresses noise in the inputs by pondering over informative inputs to strengthen their contribution to the hidden state. As a result, our method is computationally efficient and avoids the overheads of many existing methods that also attempt to improve RNN training. Despite being simpler and having a lighter memory footprint, our RNNs show competitive performance against standard LSTMs and baseline RNN models on many benchmark datasets, including those that require long-term memory.
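
The paper's update equations are not reproduced on this page, so the following is only a minimal sketch of the idea described in the abstract: a vanilla RNN cell whose effective time constant is predicted from the current observation and hidden state. It assumes a forward-Euler discretization of a leaky-integrator ODE with a learned per-unit step size in (0, 1); the class and parameter names (TimeAdaptiveRNNCell, alpha_ih, alpha_hh) are illustrative, not taken from the authors' code.

import torch
import torch.nn as nn

class TimeAdaptiveRNNCell(nn.Module):
    """Sketch: vanilla RNN cell with an input- and state-dependent
    (inverse) time constant. Assumes the discretized leaky-integrator
    update h_t = (1 - alpha_t) * h_{t-1} + alpha_t * phi(U x_t + W h_{t-1} + b),
    where alpha_t = sigmoid(...) plays the role of 1/tau."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.ih = nn.Linear(input_size, hidden_size)               # U x_t + b
        self.hh = nn.Linear(hidden_size, hidden_size, bias=False)  # W h_{t-1}
        # Maps (x_t, h_{t-1}) to a per-unit step size, i.e. an
        # adaptive reciprocal time constant in (0, 1).
        self.alpha_ih = nn.Linear(input_size, hidden_size)
        self.alpha_hh = nn.Linear(hidden_size, hidden_size, bias=False)

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        h_tilde = torch.tanh(self.ih(x) + self.hh(h))              # candidate state
        # Small alpha ~ "ponder" (retain the old state); large alpha
        # integrates an informative observation quickly.
        alpha = torch.sigmoid(self.alpha_ih(x) + self.alpha_hh(h))
        # A convex combination of old and candidate states keeps the
        # per-step Jacobian well behaved, which is one intuition for
        # the improved gradient propagation claimed in the abstract.
        return (1.0 - alpha) * h + alpha * h_tilde

cell = TimeAdaptiveRNNCell(input_size=8, hidden_size=16)
h = torch.zeros(4, 16)                  # batch of 4 hidden states
for x_t in torch.randn(10, 4, 8):       # 10 time steps of toy input
    h = cell(x_t, h)                    # shape stays (4, 16)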

Cite

Text

Kag and Saligrama. "Time Adaptive Recurrent Neural Network." Conference on Computer Vision and Pattern Recognition, 2021. doi:10.1109/CVPR46437.2021.01490

Markdown

[Kag and Saligrama. "Time Adaptive Recurrent Neural Network." Conference on Computer Vision and Pattern Recognition, 2021.](https://mlanthology.org/cvpr/2021/kag2021cvpr-time/) doi:10.1109/CVPR46437.2021.01490

BibTeX

@inproceedings{kag2021cvpr-time,
  title     = {{Time Adaptive Recurrent Neural Network}},
  author    = {Kag, Anil and Saligrama, Venkatesh},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2021},
  pages     = {15149--15158},
  doi       = {10.1109/CVPR46437.2021.01490},
  url       = {https://mlanthology.org/cvpr/2021/kag2021cvpr-time/}
}