Credit Assignment Through Time: Alternatives to Backpropagation

Abstract

Learning to recognize or predict sequences using long-term context has many applications. However, practical and theoretical problems are found in training recurrent neural networks to perform tasks in which input/output dependencies span long intervals. Starting from a mathematical analysis of the problem, we consider and compare alternative algorithms and architectures on tasks for which the span of the input/output dependencies can be controlled. Results on the new algorithms show performance qualitatively superior to that obtained with backpropagation.
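The mathematical analysis the abstract refers to is, in essence, the now-standard vanishing-gradient argument for backpropagation through time. A minimal sketch in our own notation (not the paper's): for a recurrent state a_t = f(W a_{t-1} + U x_t) and a cost C_T at time T,

\[
\frac{\partial C_T}{\partial a_\tau}
  = \frac{\partial C_T}{\partial a_T}
    \prod_{t=\tau+1}^{T} \frac{\partial a_t}{\partial a_{t-1}},
\qquad
\frac{\partial a_t}{\partial a_{t-1}} = \mathrm{diag}\!\big(f'(\cdot)\big)\, W .
\]

When the norm of each Jacobian in the product is below 1, the gradient shrinks exponentially in the span T - \tau, so error signals carry almost no information about inputs far in the past; this is the regime in which the alternatives studied in the paper are compared against backpropagation.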

Cite

Text

Bengio and Frasconi. "Credit Assignment Through Time: Alternatives to Backpropagation." Neural Information Processing Systems, 1993.

Markdown

[Bengio and Frasconi. "Credit Assignment Through Time: Alternatives to Backpropagation." Neural Information Processing Systems, 1993.](https://mlanthology.org/neurips/1993/bengio1993neurips-credit/)

BibTeX

@inproceedings{bengio1993neurips-credit,
  title     = {{Credit Assignment Through Time: Alternatives to Backpropagation}},
  author    = {Bengio, Yoshua and Frasconi, Paolo},
  booktitle = {Neural Information Processing Systems},
  year      = {1993},
  pages     = {75-82},
  url       = {https://mlanthology.org/neurips/1993/bengio1993neurips-credit/}
}