FIR and IIR Synapses, a New Neural Network Architecture for Time Series Modeling
Abstract
A new neural network architecture with either a local-feedforward global-feedforward structure or a local-recurrent global-feedforward structure is proposed. A learning rule that minimizes a mean-square-error criterion is derived. The performance of this algorithm (the local-recurrent global-feedforward architecture) is compared with that of a local-feedforward global-feedforward architecture, and the local-recurrent global-feedforward model is shown to perform better.
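For context, the FIR and IIR synapses named in the title can be viewed as linear filters replacing the scalar weights of an ordinary network: an FIR synapse weights a window of past inputs, while an IIR synapse adds local feedback of its own past outputs. A minimal sketch of both filter forms follows; the function and variable names are our own illustration, not notation from the paper:

```python
import numpy as np

def fir_synapse(x, w):
    """FIR synapse: y[t] = sum_k w[k] * x[t-k], a weighted sum
    of the current and past inputs (local feedforward)."""
    T, K = len(x), len(w)
    y = np.zeros(T)
    for t in range(T):
        for k in range(K):
            if t - k >= 0:
                y[t] += w[k] * x[t - k]
    return y

def iir_synapse(x, b, a):
    """IIR synapse: y[t] = sum_k b[k]*x[t-k] + sum_m a[m]*y[t-m],
    i.e. an FIR part plus local feedback of past outputs (local recurrent)."""
    T = len(x)
    y = np.zeros(T)
    for t in range(T):
        y[t] = sum(b[k] * x[t - k] for k in range(len(b)) if t - k >= 0)
        y[t] += sum(a[m - 1] * y[t - m] for m in range(1, len(a) + 1) if t - m >= 0)
    return y
```

In both architectures the synapse outputs would then be summed and passed through a neuron nonlinearity, with the network as a whole remaining feedforward (hence "global feedforward"); only the IIR case has recurrence, and it is confined to each synapse.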
Cite
Text
Back and Tsoi. "FIR and IIR Synapses, a New Neural Network Architecture for Time Series Modeling." Neural Computation, 1991. doi:10.1162/NECO.1991.3.3.375
Markdown
[Back and Tsoi. "FIR and IIR Synapses, a New Neural Network Architecture for Time Series Modeling." Neural Computation, 1991.](https://mlanthology.org/neco/1991/back1991neco-fir/) doi:10.1162/NECO.1991.3.3.375
BibTeX
@article{back1991neco-fir,
title = {{FIR and IIR Synapses, a New Neural Network Architecture for Time Series Modeling}},
author = {Back, Andrew D. and Tsoi, Ah Chung},
journal = {Neural Computation},
year = {1991},
pages = {375--385},
doi = {10.1162/NECO.1991.3.3.375},
volume = {3},
number = {3},
url = {https://mlanthology.org/neco/1991/back1991neco-fir/}
}