Integration and Differentiation in Dynamic Recurrent Neural Networks

Abstract

Dynamic neural networks with recurrent connections were trained by backpropagation to generate the differential or the leaky integral of a nonrepeating frequency-modulated sinusoidal signal. The trained networks performed these operations on arbitrary input waveforms. Reducing the network size by deleting ineffective hidden units and combining redundant units, and then retraining the network produced a minimal network that computed the same function and revealed the underlying computational algorithm. Networks could also be trained to compute simultaneously the differential and integral of the input on two outputs; the two operations were performed in distributed overlapping fashion, and the activations of the hidden units were dominated by the integral. Incorporating units with time constants into model networks generally enhanced their performance as integrators and interfered with their ability to differentiate.
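The training setup described above can be illustrated with a short sketch: generating a nonrepeating frequency-modulated sinusoid as the input, and computing the two target signals the networks were trained to produce, its derivative and its leaky integral. This is an assumed reconstruction for illustration only; the time step, leak rate, and modulation parameters are not taken from the paper.

```python
import numpy as np

# Illustrative sketch (parameters assumed, not from the paper):
# an FM sinusoid input and its two training targets.
dt = 0.01
t = np.arange(0, 10, dt)

# Nonrepeating FM sinusoid: instantaneous frequency drifts slowly.
inst_freq = 1.0 + 0.5 * np.sin(0.3 * t)        # assumed modulation
phase = 2 * np.pi * np.cumsum(inst_freq) * dt
x = np.sin(phase)

# Target 1: the derivative of the input (numerical approximation).
dx = np.gradient(x, dt)

# Target 2: the leaky integral, y' = -leak * y + x, integrated with
# forward Euler:  y[k+1] = (1 - leak*dt) * y[k] + x[k] * dt
leak = 0.5                                     # assumed leak rate
y = np.zeros_like(x)
for k in range(len(x) - 1):
    y[k + 1] = (1 - leak * dt) * y[k] + x[k] * dt
```

A recurrent network trained by backpropagation on pairs like `(x, dx)` or `(x, y)` would then be tested on arbitrary waveforms not seen during training, as the abstract describes.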

Cite

Text

Munro et al. "Integration and Differentiation in Dynamic Recurrent Neural Networks." Neural Computation, 1994. doi:10.1162/NECO.1994.6.3.405

Markdown

[Munro et al. "Integration and Differentiation in Dynamic Recurrent Neural Networks." Neural Computation, 1994.](https://mlanthology.org/neco/1994/munro1994neco-integration/) doi:10.1162/NECO.1994.6.3.405

BibTeX

@article{munro1994neco-integration,
  title     = {{Integration and Differentiation in Dynamic Recurrent Neural Networks}},
  author    = {Munro, Edwin E. and Shupe, Larry E. and Fetz, Eberhard E.},
  journal   = {Neural Computation},
  year      = {1994},
  pages     = {405--419},
  doi       = {10.1162/NECO.1994.6.3.405},
  volume    = {6},
  url       = {https://mlanthology.org/neco/1994/munro1994neco-integration/}
}