Quantitative Universal Approximation Bounds for Deep Belief Networks

Abstract

We show that deep belief networks with binary hidden units can approximate any multivariate probability density under very mild integrability requirements on the parental density of the visible nodes. The approximation is measured in the $L^q$-norm for $q\in[1,\infty]$ ($q=\infty$ corresponding to the supremum norm) and in Kullback-Leibler divergence. Furthermore, we establish sharp quantitative bounds on the approximation error in terms of the number of hidden units.
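As a minimal sketch of the shape such a statement takes (this is an illustrative paraphrase, not the paper's theorem: the constant $C(p,q)$ and the rate $\alpha$ below are placeholders, not the sharp quantities established in the paper):

% Illustrative form of a quantitative universal approximation bound;
% C(p,q) and the exponent \alpha are placeholders, not the paper's values.
\[
  \inf_{p_n \in \mathrm{DBN}_n} \lVert p - p_n \rVert_{L^q(\mathbb{R}^d)}
  \;\le\; C(p, q)\, n^{-\alpha}, \qquad q \in [1, \infty],
\]

where $\mathrm{DBN}_n$ denotes the set of densities realizable by a deep belief network with $n$ binary hidden units, with an analogous bound $\inf_{p_n \in \mathrm{DBN}_n} \mathrm{KL}(p \,\|\, p_n) \le C'(p)\, n^{-\alpha'}$ for the Kullback-Leibler divergence.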

Cite

Text

Julian Sieber and Johann Gehringer. "Quantitative Universal Approximation Bounds for Deep Belief Networks." International Conference on Machine Learning, 2023.

Markdown

[Julian Sieber and Johann Gehringer. "Quantitative Universal Approximation Bounds for Deep Belief Networks." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/sieber2023icml-quantitative/)

BibTeX

@inproceedings{sieber2023icml-quantitative,
  title     = {{Quantitative Universal Approximation Bounds for Deep Belief Networks}},
  author    = {Sieber, Julian and Gehringer, Johann},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {31773--31787},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/sieber2023icml-quantitative/}
}