Fixed Point Analysis for Recurrent Networks
Abstract
This paper provides a systematic analysis of the recurrent backpropagation (RBP) algorithm, introducing a number of new results. The main limitation of the RBP algorithm is that it assumes the convergence of the network to a stable fixed point in order to backpropagate the error signals. We show by experiment and eigenvalue analysis that this condition can be violated and that chaotic behavior can be avoided. Next we examine the advantages of RBP over the standard backpropagation algorithm. RBP is shown to build stable fixed points corresponding to the input patterns. This makes it an appropriate tool for content-addressable memories, one-to-many function learning, and inverse problems.
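The stability condition the abstract refers to can be illustrated with a small numerical sketch. This is not the paper's exact formulation, only a minimal, assumed setup: relax a recurrent network under the update x ← tanh(Wx + b) to a fixed point, then check the condition RBP relies on by computing the spectral radius of the update map's Jacobian at that point (stable iff it is below 1).

```python
import numpy as np

# Minimal sketch (assumed setup, not the authors' exact network):
# iterate x <- tanh(W x + b) to a fixed point, then test stability.
rng = np.random.default_rng(0)
n = 5
W = 0.15 * rng.standard_normal((n, n))  # small weights keep the map contractive
b = rng.standard_normal(n)

x = np.zeros(n)
for _ in range(1000):
    x_new = np.tanh(W @ x + b)
    if np.max(np.abs(x_new - x)) < 1e-10:
        x = x_new
        break
    x = x_new

# Jacobian of the update map at the fixed point: diag(1 - x*^2) @ W
J = (1.0 - x**2)[:, None] * W
rho = np.max(np.abs(np.linalg.eigvals(J)))
print(f"spectral radius = {rho:.3f}  (fixed point is stable iff < 1)")
```

With larger weights the spectral radius can exceed 1, in which case the relaxation no longer settles to a stable fixed point and the error backpropagation step of RBP is no longer justified.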
Simard et al. "Fixed Point Analysis for Recurrent Networks." Neural Information Processing Systems, 1988.
BibTeX:
@inproceedings{simard1988neurips-fixed,
title = {{Fixed Point Analysis for Recurrent Networks}},
author = {Simard, Patrice Y. and Ottaway, Mary B. and Ballard, Dana H.},
booktitle = {Neural Information Processing Systems},
year = {1988},
  pages = {149--159},
url = {https://mlanthology.org/neurips/1988/simard1988neurips-fixed/}
}