A Fixed Size Storage O(n³) Time Complexity Learning Algorithm for Fully Recurrent Continually Running Networks
Abstract
The real-time recurrent learning (RTRL) algorithm (Robinson and Fallside 1987; Williams and Zipser 1989) requires O(n⁴) computations per time step, where n is the number of noninput units. I describe a method suited for on-line learning that computes exactly the same gradient and requires fixed-size storage of the same order but has an average time complexity per time step of O(n³).
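For context, the O(n⁴) per-step cost of standard RTRL comes from propagating an O(n³) sensitivity tensor through the recurrent weights at every step. The sketch below illustrates that baseline cost (not the paper's improved algorithm); the weight layout, activation function, and variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rtrl_step(W, y, x, P):
    """One step of standard RTRL for a fully recurrent net with n units.

    W: (n, n+m) weights over [recurrent | input] connections (assumed layout).
    y: (n,) current unit activations; x: (m,) external input at this step.
    P: (n, n, n+m) sensitivity tensor P[k, i, j] = d y_k / d w_ij,
       the fixed-size O(n^3) storage that RTRL maintains.
    Returns the new activations and updated sensitivities. The tensordot
    below is the O(n^4)-per-step contraction that dominates the cost.
    """
    n = W.shape[0]
    z = np.concatenate([y, x])            # concatenated activations and inputs
    s = W @ z                             # net inputs
    y_new = np.tanh(s)
    fprime = 1.0 - y_new ** 2             # tanh derivative at the new state
    # Recurrent chain-rule term: rec[k, i, j] = sum_l W[k, l] * P[l, i, j]
    rec = np.tensordot(W[:, :n], P, axes=([1], [0]))
    # Direct term: d s_k / d w_ij = delta_{ki} * z_j
    direct = np.zeros_like(P)
    for i in range(n):
        direct[i, i, :] = z
    P_new = fprime[:, None, None] * (rec + direct)
    return y_new, P_new
```

Because P has O(n³) entries and each step touches every entry through the weight matrix, the per-step work is O(n⁴); the paper's method keeps the same O(n³) storage while reducing the average per-step time to O(n³).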
Cite
Schmidhuber. "A Fixed Size Storage O(n³) Time Complexity Learning Algorithm for Fully Recurrent Continually Running Networks." Neural Computation, 1992. doi:10.1162/NECO.1992.4.2.243
BibTeX
@article{schmidhuber1992neco-fixed,
title = {{A Fixed Size Storage O(n$^3$) Time Complexity Learning Algorithm for Fully Recurrent Continually Running Networks}},
author = {Schmidhuber, Jürgen},
journal = {Neural Computation},
year = {1992},
pages = {243-248},
doi = {10.1162/NECO.1992.4.2.243},
volume = {4},
number = {2},
url = {https://mlanthology.org/neco/1992/schmidhuber1992neco-fixed/}
}