An Adaptive Empirical Bayesian Method for Sparse Deep Learning

Abstract

We propose a novel adaptive empirical Bayesian (AEB) method for sparse deep learning, where sparsity is ensured via a class of self-adaptive spike-and-slab priors. The proposed method works by alternately sampling from an adaptive hierarchical posterior distribution using stochastic gradient Markov chain Monte Carlo (MCMC) and smoothly optimizing the hyperparameters using stochastic approximation (SA). The convergence of the proposed method to the asymptotically correct distribution is established under mild conditions. Empirical applications of the proposed method lead to state-of-the-art performance on MNIST and Fashion-MNIST with shallow convolutional neural networks (CNNs) and state-of-the-art compression performance on CIFAR-10 with residual networks. The proposed method also improves resistance to adversarial attacks.
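The alternating scheme described above can be sketched on a toy sparse regression problem: a stochastic gradient MCMC sampler (here plain SGLD with a full-batch gradient, for simplicity) updates the weights under a spike-and-slab prior, while a stochastic approximation step smoothly updates the per-weight slab probabilities. Everything below is an illustrative assumption — the two-Gaussian mixture prior, the variances, the step sizes, and the responsibility-based hyperparameter update stand in for the paper's actual hierarchy and are not its exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sparse regression: only the first 3 of 20 coefficients are nonzero.
n, d = 200, 20
w_true = np.zeros(d)
w_true[:3] = [2.0, -1.5, 1.0]
X = rng.normal(size=(n, d))
y = X @ w_true + 0.1 * rng.normal(size=n)

# Continuous spike-and-slab prior as a mixture of two zero-mean Gaussians
# (illustrative stand-in; variances are arbitrary choices for this demo).
sigma_spike, sigma_slab = 0.05, 1.0

w = 0.1 * rng.normal(size=d)   # model "weights"
rho = np.full(d, 0.5)          # hyperparameters: per-weight slab probabilities
lr, sa_step = 1e-4, 0.05       # SGLD step size, stochastic-approximation gain

for _ in range(3000):
    # Responsibility that each weight was drawn from the slab component.
    slab = rho * np.exp(-w**2 / (2 * sigma_slab**2)) / sigma_slab
    spike = (1 - rho) * np.exp(-w**2 / (2 * sigma_spike**2)) / sigma_spike
    r = slab / (slab + spike)

    # Gradient of the log-posterior: Gaussian likelihood (full batch, unit
    # noise variance for simplicity) plus the adaptive mixture prior.
    grad_lik = X.T @ (y - X @ w)
    grad_prior = -w * (r / sigma_slab**2 + (1 - r) / sigma_spike**2)

    # SGLD: a gradient step plus injected Gaussian noise.
    w += lr * (grad_lik + grad_prior) + np.sqrt(2 * lr) * rng.normal(size=d)

    # Stochastic approximation: smooth the hyperparameters toward r.
    rho += sa_step * (r - rho)

print("slab probabilities:", np.round(rho, 2))
```

In this sketch the smoothed `rho` plays the role of the self-adaptive prior: weights that the sampler keeps near zero see their slab probability decay toward 0 (and thus feel a sharper spike), while weights with clear signal keep a slab probability near 1 — a rough, hypothetical analogue of how the SA step adapts the hierarchy between MCMC sweeps.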

Cite

Text

Deng et al. "An Adaptive Empirical Bayesian Method for Sparse Deep Learning." Neural Information Processing Systems, 2019.

Markdown

[Deng et al. "An Adaptive Empirical Bayesian Method for Sparse Deep Learning." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/deng2019neurips-adaptive/)

BibTeX

@inproceedings{deng2019neurips-adaptive,
  title     = {{An Adaptive Empirical Bayesian Method for Sparse Deep Learning}},
  author    = {Deng, Wei and Zhang, Xiao and Liang, Faming and Lin, Guang},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {5563--5573},
  url       = {https://mlanthology.org/neurips/2019/deng2019neurips-adaptive/}
}