Generalized Boltzmann Machine with Deep Neural Structure
Abstract
The Restricted Boltzmann Machine (RBM) is an essential component in many machine learning applications. As a probabilistic graphical model, the RBM has a shallow structure, which limits its ability to model real-world data. In this paper, to bridge the gap between the RBM and artificial neural networks, we propose an energy-based probabilistic model that is more flexible in modeling continuous data. By introducing the pair-wise inverse autoregressive flow into the RBM, we obtain two generalized continuous RBMs that embed deep neural network structure, allowing them to track practical data distributions more flexibly while keeping inference tractable. In addition, we extend the generalized RBM structures to the sequential setting to better model the stochastic processes underlying time series. Experiments on diverse datasets demonstrate performance improvements in both probabilistic modeling and representation learning.
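As background on the building blocks named in the abstract, the sketch below pairs a standard Gaussian-Bernoulli RBM energy with a single inverse autoregressive flow (IAF) step that warps the continuous visible units. It is a minimal illustration under assumed shapes and a masked-linear parameterization, not the pair-wise construction developed in the paper.

```python
# Illustrative sketch only: a Gaussian-Bernoulli RBM energy plus one IAF step.
# All shapes, names, and the masked-linear parameterization are assumptions
# made for exposition; they do not reproduce the authors' exact model.
import numpy as np

rng = np.random.default_rng(0)
n_vis, n_hid = 6, 4

# RBM parameters: visible bias a, hidden bias b, weight matrix W.
a = rng.normal(size=n_vis)
b = rng.normal(size=n_hid)
W = rng.normal(scale=0.1, size=(n_vis, n_hid))

def rbm_energy(v, h):
    """Standard Gaussian-Bernoulli RBM energy (unit visible variance)."""
    return 0.5 * np.sum((v - a) ** 2) - b @ h - v @ W @ h

# One IAF step: shift mu and log-scale s depend autoregressively on earlier
# coordinates of the noise z (strictly lower-triangular masked linear maps),
# then x = mu + exp(s) * z (Kingma et al., 2016).
mask = np.tril(np.ones((n_vis, n_vis)), k=-1)
U_mu = rng.normal(scale=0.1, size=(n_vis, n_vis)) * mask
U_s  = rng.normal(scale=0.1, size=(n_vis, n_vis)) * mask

def iaf_forward(z):
    """Invertible transform whose inverse is autoregressive in x."""
    mu, s = U_mu @ z, U_s @ z
    x = mu + np.exp(s) * z
    log_det = np.sum(s)  # Jacobian is triangular, so log|det dx/dz| = sum(s)
    return x, log_det

z = rng.normal(size=n_vis)           # pre-flow continuous visible sample
h = rng.integers(0, 2, size=n_hid)   # binary hidden state
v, log_det = iaf_forward(z)
print("energy:", rbm_energy(v, h), " log|det J|:", log_det)
```

Because mu and s depend only on earlier coordinates of z, the Jacobian of the flow is triangular and its log-determinant reduces to a sum, which is what keeps the change of variables, and hence inference, tractable when such a flow is composed with an energy-based model.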
Cite
Text
Liu et al. "Generalized Boltzmann Machine with Deep Neural Structure." Artificial Intelligence and Statistics, 2019.
BibTeX
@inproceedings{liu2019aistats-generalized,
title = {{Generalized Boltzmann Machine with Deep Neural Structure}},
author = {Liu, Yingru and Xie, Dongliang and Wang, Xin},
booktitle = {Artificial Intelligence and Statistics},
year = {2019},
pages = {926--934},
volume = {89},
url = {https://mlanthology.org/aistats/2019/liu2019aistats-generalized/}
}