Analog Neural Nets with Gaussian or Other Common Noise Distribution Cannot Recognize Arbitrary Regular Languages

Abstract

We consider recurrent analog neural nets where the output of each gate is subject to Gaussian noise or any other common noise distribution that is nonzero on a sufficiently large part of the state-space. We show that many regular languages cannot be recognized by networks of this type, and we give a precise characterization of the languages that can be recognized. This result implies severe constraints on the possibilities for constructing recurrent analog neural nets that are robust against realistic types of analog noise. On the other hand, we present a method for constructing feedforward analog neural nets that are robust with regard to analog noise of this type.

Cite

Text

Maass and Sontag. "Analog Neural Nets with Gaussian or Other Common Noise Distribution Cannot Recognize Arbitrary Regular Languages." Neural Computation, 11:771-782, 1999. doi:10.1162/089976699300016656

Markdown

[Maass and Sontag. "Analog Neural Nets with Gaussian or Other Common Noise Distribution Cannot Recognize Arbitrary Regular Languages." Neural Computation, 11:771-782, 1999.](https://mlanthology.org/neco/1999/maass1999neco-analog/) doi:10.1162/089976699300016656

BibTeX

@article{maass1999neco-analog,
  title     = {{Analog Neural Nets with Gaussian or Other Common Noise Distribution Cannot Recognize Arbitrary Regular Languages}},
  author    = {Maass, Wolfgang and Sontag, Eduardo D.},
  journal   = {Neural Computation},
  year      = {1999},
  pages     = {771--782},
  doi       = {10.1162/089976699300016656},
  volume    = {11},
  url       = {https://mlanthology.org/neco/1999/maass1999neco-analog/}
}