Training Restricted Boltzmann Machines with Overlapping Partitions

Abstract

Restricted Boltzmann Machines (RBMs) are energy-based models that are successfully used as generative learning models as well as crucial components of Deep Belief Networks (DBNs). The most successful training method to date for RBMs is the Contrastive Divergence method. However, Contrastive Divergence is inefficient when the number of features is very high and the mixing rate of the Gibbs chain is slow. We propose a new training method that partitions a single RBM into multiple overlapping small RBMs. The final RBM is learned by layers of partitions. We show that this method is not only faster but also more accurate in terms of its generative power.
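The core idea in the abstract — training small RBMs on overlapping slices of the visible units and combining them into one model — can be sketched as follows. This is a simplified illustration, not the authors' implementation: the paper trains layers of partitions, while this sketch trains each overlapping partition with CD-1 (biases omitted) and averages weights where partitions overlap; all function names (`cd1_step`, `train_partitioned_rbm`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, v0, lr=0.1):
    """One CD-1 update on a small RBM (bias terms omitted for brevity)."""
    h0 = sigmoid(v0 @ W)                    # hidden activation probabilities
    h0_sample = (rng.random(h0.shape) < h0).astype(float)
    v1 = sigmoid(h0_sample @ W.T)           # one-step reconstruction
    h1 = sigmoid(v1 @ W)
    # positive phase minus negative phase, averaged over the batch
    return W + lr * (v0.T @ h0 - v1.T @ h1) / v0.shape[0]

def overlapping_partitions(n_visible, part_size, overlap):
    """Start indices of overlapping windows over the visible units."""
    stride = part_size - overlap
    starts = list(range(0, max(n_visible - part_size, 0) + 1, stride))
    if starts[-1] + part_size < n_visible:  # make sure the tail is covered
        starts.append(n_visible - part_size)
    return starts

def train_partitioned_rbm(data, n_hidden, part_size, overlap, epochs=5):
    """Train one small RBM per overlapping partition, then merge weights."""
    n_visible = data.shape[1]
    W = np.zeros((n_visible, n_hidden))
    counts = np.zeros((n_visible, 1))
    for start in overlapping_partitions(n_visible, part_size, overlap):
        sl = slice(start, start + part_size)
        Wp = 0.01 * rng.standard_normal((part_size, n_hidden))
        for _ in range(epochs):
            Wp = cd1_step(Wp, data[:, sl])  # CD-1 on this partition only
        W[sl] += Wp
        counts[sl] += 1
    return W / counts  # average where partitions overlap

# toy usage on random binary data: 12 visible units, partitions of 6 with overlap 2
data = (rng.random((32, 12)) < 0.5).astype(float)
W = train_partitioned_rbm(data, n_hidden=4, part_size=6, overlap=2)
```

Each partition's Gibbs chain runs over only `part_size` visible units, which is the source of the speedup the abstract claims: mixing and gradient estimation scale with the partition size rather than the full feature count.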

Cite

Text

Tosun and Sheppard. "Training Restricted Boltzmann Machines with Overlapping Partitions." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2014. doi:10.1007/978-3-662-44845-8_13

Markdown

[Tosun and Sheppard. "Training Restricted Boltzmann Machines with Overlapping Partitions." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2014.](https://mlanthology.org/ecmlpkdd/2014/tosun2014ecmlpkdd-training/) doi:10.1007/978-3-662-44845-8_13

BibTeX

@inproceedings{tosun2014ecmlpkdd-training,
  title     = {{Training Restricted Boltzmann Machines with Overlapping Partitions}},
  author    = {Tosun, Hasari and Sheppard, John W.},
  booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
  year      = {2014},
  pages     = {195--208},
  doi       = {10.1007/978-3-662-44845-8_13},
  url       = {https://mlanthology.org/ecmlpkdd/2014/tosun2014ecmlpkdd-training/}
}