Convergence Rate of Minimization Learning for Neural Networks

Abstract

In this paper, we present the convergence rate of the error in a neural network trained by a constructive method. The constructive mechanism learns the network by adding hidden units to it. The main idea of this work is to find the eigenvalues of the transformation matrix relating the error before and after hidden units are added to the network. Using these eigenvalues, we show the relation between the convergence rates of neural networks with and without thresholds in the output layer.
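The constructive idea described above can be illustrated with a minimal sketch, not taken from the paper: hidden units are added one at a time, each fitted to the current residual error, and the per-step ratio of error norms plays the role that the eigenvalues of the error-transformation matrix play in the analysis. All names and the toy target function are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression target (hypothetical example, not from the paper).
x = np.linspace(-1.0, 1.0, 100)
y = np.sin(np.pi * x)

residual = y.copy()                      # error before any hidden unit is added
errors = [np.linalg.norm(residual)]

# Constructively add hidden units: each unit gets random input weights,
# and its output weight is chosen by least squares so that the unit
# cancels as much of the current residual as possible.
for _ in range(10):
    w, b = rng.normal(size=2)
    h = np.tanh(w * x + b)               # candidate unit's activations
    c = (h @ residual) / (h @ h)         # optimal output weight for this unit
    residual -= c * h                    # error after adding the unit
    errors.append(np.linalg.norm(residual))

# Per-step contraction ratios ||e_{k+1}|| / ||e_k||; the paper bounds such
# ratios via the eigenvalues of the error-transformation matrix.
ratios = [errors[k + 1] / errors[k] for k in range(len(errors) - 1)]
```

Because each output weight is a least-squares fit to the residual, adding a unit can never increase the error norm, so every ratio is at most one; the size of the largest such ratio governs the convergence rate.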

Cite

Text

Mohamed et al. "Convergence Rate of Minimization Learning for Neural Networks." European Conference on Machine Learning, 1998. doi:10.1007/BFB0026712

Markdown

[Mohamed et al. "Convergence Rate of Minimization Learning for Neural Networks." European Conference on Machine Learning, 1998.](https://mlanthology.org/ecmlpkdd/1998/mohamed1998ecml-convergence/) doi:10.1007/BFB0026712

BibTeX

@inproceedings{mohamed1998ecml-convergence,
  title     = {{Convergence Rate of Minimization Learning for Neural Networks}},
  author    = {Mohamed, Marghny H. and Minamoto, Teruya and Niijima, Koichi},
  booktitle = {European Conference on Machine Learning},
  year      = {1998},
  pages     = {412--417},
  doi       = {10.1007/BFB0026712},
  url       = {https://mlanthology.org/ecmlpkdd/1998/mohamed1998ecml-convergence/}
}