General-Purpose Computation with Neural Networks: A Survey of Complexity Theoretic Results

Abstract

We survey and summarize the literature on the computational aspects of neural network models by presenting a detailed taxonomy of the various models according to their complexity-theoretic characteristics. The criteria of classification include the architecture of the network (feedforward versus recurrent), time model (discrete versus continuous), state type (binary versus analog), weight constraints (symmetric versus asymmetric), network size (finite nets versus infinite families), and computation type (deterministic versus probabilistic), among others. The underlying results concerning the computational power and complexity issues of perceptron, radial basis function, winner-take-all, and spiking neural networks are briefly surveyed, with pointers to the relevant literature. In our survey, we focus mainly on digital computation, whose inputs and outputs are binary in nature, although their values are quite often encoded as analog neuron states. We omit the important issues of learning.

Cite

Text

Šíma and Orponen. "General-Purpose Computation with Neural Networks: A Survey of Complexity Theoretic Results." Neural Computation, 2003. doi:10.1162/089976603322518731

Markdown

[Šíma and Orponen. "General-Purpose Computation with Neural Networks: A Survey of Complexity Theoretic Results." Neural Computation, 2003.](https://mlanthology.org/neco/2003/sima2003neco-generalpurpose/) doi:10.1162/089976603322518731

BibTeX

@article{sima2003neco-generalpurpose,
  title     = {{General-Purpose Computation with Neural Networks: A Survey of Complexity Theoretic Results}},
  author    = {Šíma, Jiří and Orponen, Pekka},
  journal   = {Neural Computation},
  year      = {2003},
  pages     = {2727--2778},
  doi       = {10.1162/089976603322518731},
  volume    = {15},
  url       = {https://mlanthology.org/neco/2003/sima2003neco-generalpurpose/}
}