Global Convergence Rate of Recurrently Connected Neural Networks
Abstract
We investigate recurrently connected neural networks and their global exponential stability (GES). We give sufficient conditions under which a class of recurrent neural networks is globally exponentially stable, together with a sharp estimate of the convergence rate.
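To illustrate the kind of behavior the paper studies, here is a minimal numerical sketch (not the paper's exact model or conditions): a classic additive recurrent network dx/dt = -x + W tanh(x) + I, where a standard sufficient condition for global exponential stability is that the activation's Lipschitz constant (1 for tanh) times the spectral norm of W is below 1. All names and parameter choices below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch, not the paper's model: an additive recurrent
# network  dx/dt = -x + W @ tanh(x) + I.  We scale W so that
# ||W||_2 = 0.5 < 1, a standard contraction condition that implies
# global exponential stability for this class of networks.
rng = np.random.default_rng(0)
n = 5
W = rng.standard_normal((n, n))
W *= 0.5 / np.linalg.norm(W, 2)   # enforce spectral norm 0.5
I = rng.standard_normal(n)

def simulate(x0, dt=0.01, steps=2000):
    """Forward-Euler integration of the network dynamics."""
    x = x0.copy()
    traj = [x.copy()]
    for _ in range(steps):
        x = x + dt * (-x + W @ np.tanh(x) + I)
        traj.append(x.copy())
    return np.array(traj)

# Two trajectories started from different random states converge to
# the same equilibrium; their distance shrinks roughly exponentially,
# here at rate about exp(-(1 - ||W||_2) t) = exp(-0.5 t).
a = simulate(rng.standard_normal(n) * 5.0)
b = simulate(rng.standard_normal(n) * 5.0)
gap = np.linalg.norm(a - b, axis=1)
print(gap[0], gap[-1])  # the final gap is orders of magnitude smaller
```

Under the contraction condition the distance between any two trajectories decays exponentially regardless of the initial states, which is exactly the global (as opposed to local) flavor of stability the abstract refers to.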
Chen et al. "Global Convergence Rate of Recurrently Connected Neural Networks." Neural Computation, 2002. doi:10.1162/089976602760805359
BibTeX
@article{chen2002neco-global,
title = {{Global Convergence Rate of Recurrently Connected Neural Networks}},
author = {Chen, Tianping and Lu, Wenlian and Amari, Shun-ichi},
journal = {Neural Computation},
year = {2002},
pages = {2947-2957},
doi = {10.1162/089976602760805359},
volume = {14},
url = {https://mlanthology.org/neco/2002/chen2002neco-global/}
}