Stochastic Neural Networks with Monotonic Activation Functions

Abstract

We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation to training a family of Restricted Boltzmann Machines (RBMs) that are closely linked to Bregman divergences. This family, which we call the exponential family RBM (Exp-RBM), is a subset of the exponential family Harmoniums that expresses its members through a choice of smooth monotonic non-linearity for each neuron. Using contrastive divergence along with our Gaussian approximation, we show that Exp-RBMs can learn useful representations using novel stochastic units.
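To make the construction concrete, below is a minimal NumPy sketch of such a stochastic unit. It assumes the paper's Gaussian (Laplace) approximation takes the form y ~ N(f(η), f′(η)), where f is the smooth monotonic activation and f′ its derivative; the function names and the softplus example are illustrative choices, not the paper's code.

```python
import numpy as np

def sample_stochastic_unit(eta, f, f_prime, rng=None):
    """Sample y ~ N(f(eta), f'(eta)): a Gaussian whose mean is the
    activation value and whose variance is the activation's slope.
    `f` and `f_prime` are supplied by the caller (assumed forms)."""
    rng = rng or np.random.default_rng()
    mean = f(eta)
    var = f_prime(eta)  # nonnegative, since f is monotonically increasing
    return mean + np.sqrt(var) * rng.standard_normal(np.shape(eta))

# Example: a stochastic softplus unit (smooth and monotonic).
softplus = lambda x: np.log1p(np.exp(x))
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))  # derivative of softplus
eta = np.array([-1.0, 0.0, 2.0])               # pre-activation inputs
y = sample_stochastic_unit(eta, softplus, sigmoid)
```

Note how the noise vanishes wherever the activation is flat (f′ ≈ 0) and grows where the activation is steep, which is what lets a single Gaussian source of noise adapt to any smooth monotonic non-linearity.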

Cite

Text

Ravanbakhsh et al. "Stochastic Neural Networks with Monotonic Activation Functions." International Conference on Artificial Intelligence and Statistics, 2016.

Markdown

[Ravanbakhsh et al. "Stochastic Neural Networks with Monotonic Activation Functions." International Conference on Artificial Intelligence and Statistics, 2016.](https://mlanthology.org/aistats/2016/ravanbakhsh2016aistats-stochastic/)

BibTeX

@inproceedings{ravanbakhsh2016aistats-stochastic,
  title     = {{Stochastic Neural Networks with Monotonic Activation Functions}},
  author    = {Ravanbakhsh, Siamak and Póczos, Barnabás and Schneider, Jeff G. and Schuurmans, Dale and Greiner, Russell},
  booktitle = {International Conference on Artificial Intelligence and Statistics},
  year      = {2016},
  pages     = {809--818},
  url       = {https://mlanthology.org/aistats/2016/ravanbakhsh2016aistats-stochastic/}
}