Universal Approximation by Phase Series and Fixed-Weight Networks

Abstract

In this note we show that weak (specified-energy-bound) universal approximation by neural networks is possible if variable synaptic weights are supplied as network inputs rather than being embedded in the network. We illustrate this idea with a Fourier series network that we transform into what we call a phase series network. The transformation increases the number of neurons by only a factor of two.
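The phase-series idea rests on a standard trigonometric identity: a pair of weighted sine and cosine terms can be rewritten as a single cosine with an amplitude and a phase, so the variable coefficients can be moved out of the weights and supplied as inputs. The sketch below is only a numerical check of that underlying identity, not the paper's network construction; the function names and parameter choices are illustrative assumptions.

```python
import math

def fourier_term(a, b, w, x):
    # Two-term Fourier form with variable coefficients a, b as weights:
    # a*cos(w*x) + b*sin(w*x)
    return a * math.cos(w * x) + b * math.sin(w * x)

def phase_term(a, b, w, x):
    # Equivalent single-cosine "phase" form: r*cos(w*x - phi).
    # Here the amplitude r and phase phi are derived from (a, b) and
    # could be fed to a fixed-weight unit as inputs.
    r = math.hypot(a, b)           # r = sqrt(a^2 + b^2)
    phi = math.atan2(b, a)         # r*cos(phi) = a, r*sin(phi) = b
    return r * math.cos(w * x - phi)

# The two forms agree at arbitrary sample points.
for x in [0.0, 0.3, 1.7, -2.5]:
    assert abs(fourier_term(2.0, -1.5, 3.0, x)
               - phase_term(2.0, -1.5, 3.0, x)) < 1e-12
```

The identity holds because r*cos(wx - phi) expands to r*cos(phi)*cos(wx) + r*sin(phi)*sin(wx), which equals a*cos(wx) + b*sin(wx) by the definitions of r and phi.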

Cite

Text

Cotter and Conwell. "Universal Approximation by Phase Series and Fixed-Weight Networks." Neural Computation, 1993. doi:10.1162/NECO.1993.5.3.359

Markdown

[Cotter and Conwell. "Universal Approximation by Phase Series and Fixed-Weight Networks." Neural Computation, 1993.](https://mlanthology.org/neco/1993/cotter1993neco-universal/) doi:10.1162/NECO.1993.5.3.359

BibTeX

@article{cotter1993neco-universal,
  title     = {{Universal Approximation by Phase Series and Fixed-Weight Networks}},
  author    = {Cotter, Neil E. and Conwell, Peter R.},
  journal   = {Neural Computation},
  year      = {1993},
  volume    = {5},
  number    = {3},
  pages     = {359--362},
  doi       = {10.1162/NECO.1993.5.3.359},
  url       = {https://mlanthology.org/neco/1993/cotter1993neco-universal/}
}