Gated Recurrent Neural Networks with Weighted Time-Delay Feedback
Abstract
In this paper, we present a novel approach to modeling long-term dependencies in sequential data by introducing a gated recurrent unit (GRU) with a weighted time-delay feedback mechanism. Our proposed model, named $\tau$-GRU, is a discretized version of a continuous-time formulation of a recurrent unit, where the dynamics are governed by delay differential equations (DDEs). We prove the existence and uniqueness of solutions for the continuous-time model and show that the proposed feedback mechanism can significantly improve the modeling of long-term dependencies. Our empirical results indicate that $\tau$-GRU outperforms state-of-the-art recurrent units and gated recurrent architectures on a range of tasks, achieving faster convergence and better generalization.
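The core idea — a GRU-style cell whose candidate state also receives feedback from a delayed hidden state $h_{t-\tau}$, as arises when discretizing a delay differential equation — can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact parameterization: the weight names, gating structure, and the single fixed delay `tau` are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TauGRUSketch:
    """Hypothetical sketch of a GRU cell with weighted time-delay feedback.

    The actual tau-GRU follows the paper's DDE-based derivation; this only
    illustrates mixing the candidate state with a delayed hidden state
    h_{t - tau} through a learned weight matrix V.
    """

    def __init__(self, input_size, hidden_size, tau=10, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        self.tau = tau
        self.hidden_size = hidden_size
        # standard GRU-style weights: update gate z and candidate state
        self.Wz = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uz = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.Wh = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uh = rng.uniform(-s, s, (hidden_size, hidden_size))
        # weight matrix applied to the delayed feedback term
        self.V = rng.uniform(-s, s, (hidden_size, hidden_size))

    def run(self, xs):
        """Process a sequence xs of shape (T, input_size); return final state."""
        hs = [np.zeros(self.hidden_size)]  # h_0
        for t in range(len(xs)):
            h_prev = hs[-1]
            # delayed state h_{t - tau}; zero history before the delay horizon
            h_delay = hs[t - self.tau] if t >= self.tau else np.zeros(self.hidden_size)
            z = sigmoid(self.Wz @ xs[t] + self.Uz @ h_prev)
            h_cand = np.tanh(self.Wh @ xs[t] + self.Uh @ h_prev + self.V @ h_delay)
            # convex combination of previous and candidate state, as in a GRU
            hs.append((1.0 - z) * h_prev + z * h_cand)
        return hs[-1]
```

Because each state is a convex combination of the previous state and a tanh-bounded candidate, the hidden state stays bounded, while the `V @ h_delay` term gives gradients a direct path across `tau` steps — the intuition behind improved long-term dependency modeling.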
Cite

Erichson et al. "Gated Recurrent Neural Networks with Weighted Time-Delay Feedback." Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, 2025. https://mlanthology.org/aistats/2025/erichson2025aistats-gated/

BibTeX:
@inproceedings{erichson2025aistats-gated,
title = {{Gated Recurrent Neural Networks with Weighted Time-Delay Feedback}},
author = {Erichson, N. Benjamin and Lim, Soon Hoe and Mahoney, Michael W.},
booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
year = {2025},
pages = {3646--3654},
volume = {258},
url = {https://mlanthology.org/aistats/2025/erichson2025aistats-gated/}
}