Neural Networks and Rational Functions

Abstract

Neural networks and rational functions efficiently approximate each other. In more detail, it is shown here that for any ReLU network, there exists a rational function of degree $O(\mathrm{polylog}(1/\epsilon))$ which is $\epsilon$-close, and similarly for any rational function there exists a ReLU network of size $O(\mathrm{polylog}(1/\epsilon))$ which is $\epsilon$-close. By contrast, polynomials need degree $\Omega(\mathrm{poly}(1/\epsilon))$ to approximate even a single ReLU. When converting a ReLU network to a rational function as above, the hidden constants depend exponentially on the number of layers, which is shown to be tight; in other words, a compositional representation can be beneficial even for rational functions.
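As a rough numerical illustration of the gap the abstract describes, the sketch below compares Newman's classical rational approximant of $|x|$ on $[-1,1]$ (sup-error decaying like $e^{-\sqrt{n}}$ in the degree $n$) against a degree-$n$ polynomial fit of the ReLU, using the identity $\mathrm{ReLU}(x) = (x + |x|)/2$. This is an assumption-laden sketch, not the paper's construction: the degree schedule is arbitrary, and a least-squares Chebyshev fit stands in for the true minimax polynomial (whose error for $|x|$ is known to be $\Theta(1/n)$).

```python
import numpy as np

def newman_abs(x, n):
    """Newman's degree-n rational approximation of |x| on [-1, 1].

    Sup-error is O(exp(-sqrt(n))), versus Omega(1/n) for any
    degree-n polynomial approximation of |x|.
    """
    xi = np.exp(-1.0 / np.sqrt(n))
    nodes = xi ** np.arange(n)                    # xi^0, ..., xi^(n-1)
    p_pos = np.prod(x[:, None] + nodes, axis=1)   # p(x)  = prod_k (x + xi^k)
    p_neg = np.prod(-x[:, None] + nodes, axis=1)  # p(-x)
    return x * (p_pos - p_neg) / (p_pos + p_neg)

xs = np.linspace(-1.0, 1.0, 2001)
relu = np.maximum(xs, 0.0)

for n in [4, 16, 36, 64]:
    # Rational approximation of ReLU via ReLU(x) = (x + |x|) / 2.
    rat = 0.5 * (xs + newman_abs(xs, n))
    # Degree-n least-squares Chebyshev fit as a polynomial stand-in
    # (not the minimax polynomial, but within a modest factor of it).
    cheb = np.polynomial.chebyshev.chebval(
        xs, np.polynomial.chebyshev.chebfit(xs, relu, n))
    print(f"degree {n:2d}:  rational sup-error {np.max(np.abs(relu - rat)):.1e}"
          f"   polynomial sup-error {np.max(np.abs(relu - cheb)):.1e}")
```

At small degrees the two are comparable, but as $n$ grows the rational error shrinks exponentially while the polynomial error shrinks only polynomially, matching the $O(\mathrm{polylog}(1/\epsilon))$ versus $\Omega(\mathrm{poly}(1/\epsilon))$ separation stated above.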

Cite

Text

Telgarsky. "Neural Networks and Rational Functions." International Conference on Machine Learning, 2017.

Markdown

[Telgarsky. "Neural Networks and Rational Functions." International Conference on Machine Learning, 2017.](https://mlanthology.org/icml/2017/telgarsky2017icml-neural/)

BibTeX

@inproceedings{telgarsky2017icml-neural,
  title     = {{Neural Networks and Rational Functions}},
  author    = {Telgarsky, Matus},
  booktitle = {International Conference on Machine Learning},
  year      = {2017},
  pages     = {3387--3393},
  volume    = {70},
  url       = {https://mlanthology.org/icml/2017/telgarsky2017icml-neural/}
}