A Precise Characterization of the Class of Languages Recognized by Neural Nets Under Gaussian and Other Common Noise Distributions
Abstract
We consider recurrent analog neural nets where each gate is subject to Gaussian noise, or any other common noise distribution whose probability density function is nonzero on a large set. We show that many regular languages cannot be recognized by networks of this type, for example the language {w ∈ {0, 1}* | w begins with 0}, and we give a precise characterization of those languages which can be recognized. This result implies severe constraints on possibilities for constructing recurrent analog neural nets that are robust against realistic types of analog noise. On the other hand, we present a method for constructing feedforward analog neural nets that are robust with regard to analog noise of this type.
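To make the setting concrete, here is a minimal sketch (not the paper's construction) of the kind of model the abstract describes: a sigmoid recurrent net whose gate outputs receive additive Gaussian noise at every time step, shown next to an exact recognizer for the example language {w ∈ {0, 1}* | w begins with 0}. The weights, noise level sigma, and the 0.5 acceptance threshold are arbitrary choices made for this illustration; the paper's result is that no assignment of such parameters can recognize this language reliably when the noise density has support on a large set.

```python
import numpy as np


def begins_with_zero(w):
    """Exact recognizer for the example language from the abstract."""
    return len(w) > 0 and w[0] == 0


def noisy_rnn_accepts(w, W, V, b, sigma=0.1, rng=None):
    """One run of a sigmoid recurrent net with Gaussian noise on each gate.

    W, V, b, sigma, and the 0.5 threshold are hypothetical values chosen for
    this sketch only; they do not come from the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.zeros(W.shape[0])
    for symbol in w:
        u = W @ x + V * symbol + b          # pre-activation of each gate
        x = 1.0 / (1.0 + np.exp(-u))        # analog (sigmoid) gate output
        x = x + rng.normal(0.0, sigma, size=x.shape)  # Gaussian noise per gate
    return x[0] > 0.5  # read acceptance off the first gate


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.normal(size=(3, 3))
    V = rng.normal(size=3)
    b = rng.normal(size=3)
    w = [0, 1, 1, 0]
    print("exact recognizer:", begins_with_zero(w))
    print("noisy net run:   ", noisy_rnn_accepts(w, W, V, b, rng=rng))
```

Running the noisy net many times on the same input gives different answers across runs, which is the intuition behind the impossibility result: membership in this language depends only on the first symbol, and that information is eventually washed out by the per-step noise.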
Cite
Text
Maass and Sontag. "A Precise Characterization of the Class of Languages Recognized by Neural Nets Under Gaussian and Other Common Noise Distributions." Neural Information Processing Systems, 1998.

Markdown
[Maass and Sontag. "A Precise Characterization of the Class of Languages Recognized by Neural Nets Under Gaussian and Other Common Noise Distributions." Neural Information Processing Systems, 1998.](https://mlanthology.org/neurips/1998/maass1998neurips-precise/)

BibTeX
@inproceedings{maass1998neurips-precise,
title = {{A Precise Characterization of the Class of Languages Recognized by Neural Nets Under Gaussian and Other Common Noise Distributions}},
author = {Maass, Wolfgang and Sontag, Eduardo D.},
booktitle = {Neural Information Processing Systems},
year = {1998},
pages = {281-287},
url = {https://mlanthology.org/neurips/1998/maass1998neurips-precise/}
}