Degree of Approximation Results for Feedforward Networks Approximating Unknown Mappings and Their Derivatives

Abstract

Barron (1993) recently gave rates of approximation for single hidden layer feedforward networks with sigmoid activation functions approximating a class of functions satisfying a certain smoothness condition. These rates do not depend on the dimension of the input space. We extend Barron's results to feedforward networks with possibly nonsigmoid activation functions that approximate mappings and their derivatives simultaneously. Our conditions are similar to, but not identical with, Barron's; nevertheless, we obtain the same rates of approximation, showing that the approximation error decreases at rates as fast as n^{-1/2}, where n is the number of hidden units. The dimension of the input space appears only in the constants of our bounds.
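The rate claim in the abstract can be summarized in the following schematic form (a sketch in our own notation, not the paper's: f_n ranges over single hidden layer networks with n hidden units, and C_{f,d} is a constant depending on the target function f and the input dimension d but not on n):

```latex
\inf_{f_n} \, \| f - f_n \| \;\le\; \frac{C_{f,d}}{\sqrt{n}},
```

where, for the extension described here, the norm is understood to measure the error in both the mapping and its derivatives simultaneously. The key point is that n enters only through the factor n^{-1/2}, while d affects only the constant C_{f,d}.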

Cite

Text

Hornik et al. "Degree of Approximation Results for Feedforward Networks Approximating Unknown Mappings and Their Derivatives." Neural Computation, 1994. doi:10.1162/NECO.1994.6.6.1262

Markdown

[Hornik et al. "Degree of Approximation Results for Feedforward Networks Approximating Unknown Mappings and Their Derivatives." Neural Computation, 1994.](https://mlanthology.org/neco/1994/hornik1994neco-degree/) doi:10.1162/NECO.1994.6.6.1262

BibTeX

@article{hornik1994neco-degree,
  title     = {{Degree of Approximation Results for Feedforward Networks Approximating Unknown Mappings and Their Derivatives}},
  author    = {Hornik, Kurt and Stinchcombe, Maxwell B. and White, Halbert and Auer, Peter},
  journal   = {Neural Computation},
  year      = {1994},
  pages     = {1262--1275},
  doi       = {10.1162/NECO.1994.6.6.1262},
  volume    = {6},
  number    = {6},
  url       = {https://mlanthology.org/neco/1994/hornik1994neco-degree/}
}