EP-GIG Priors and Applications in Bayesian Sparse Learning

Abstract

In this paper we propose a novel framework for constructing sparsity-inducing priors. In particular, we define such priors as a mixture of exponential power distributions with a generalized inverse Gaussian mixing density (EP-GIG). EP-GIG is a variant of the generalized hyperbolic distributions, and its special cases include Gaussian scale mixtures and Laplace scale mixtures. Furthermore, Laplace scale mixtures can provide a Bayesian framework for sparse learning with nonconvex penalization. The densities of EP-GIG can be expressed in closed form. Moreover, the corresponding posterior distribution of the mixing variable also follows a generalized inverse Gaussian distribution. We exploit these properties to develop EM algorithms for sparse empirical Bayesian learning. We also show that these algorithms bear an interesting resemblance to iteratively reweighted ℓ2 or ℓ1 methods. Finally, we present two extensions, one for grouped variable selection and one for logistic regression.
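
To make the connection to iteratively reweighted ℓ2 methods concrete, the sketch below shows a generic EM loop for sparse linear regression under a Laplace prior written as a Gaussian scale mixture: the E-step computes the expected inverse scales of the latent mixing variables, and the M-step solves a weighted ridge problem. This is only an illustrative sketch of the reweighted-ℓ2 idea, not the paper's EP-GIG updates; the function name, hyperparameter values, and the fixed noise variance are assumptions made for the example.

    import numpy as np

    def em_irls_l2(X, y, lam=1.0, sigma2=1.0, n_iter=100, eps=1e-8):
        """Illustrative EM / iteratively reweighted l2 loop for sparse
        linear regression with a Laplace prior viewed as a Gaussian
        scale mixture (not the paper's exact EP-GIG algorithm)."""
        XtX, Xty = X.T @ X, X.T @ y
        b = np.linalg.lstsq(X, y, rcond=None)[0]   # least-squares initialization
        for _ in range(n_iter):
            # E-step: expected inverse scales, E[1/tau_j | b_j] = lam / |b_j|
            w = lam / np.maximum(np.abs(b), eps)
            # M-step: weighted ridge solve, (X'X + sigma2 * diag(w)) b = X'y
            b = np.linalg.solve(XtX + sigma2 * np.diag(w), Xty)
        return b

    # Example usage on synthetic data (hypothetical values)
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    b_true = np.zeros(20)
    b_true[:3] = [2.0, -1.5, 1.0]
    y = X @ b_true + 0.1 * rng.standard_normal(100)
    b_hat = em_irls_l2(X, y, lam=5.0, sigma2=0.01)

As a coefficient shrinks toward zero its weight grows, so the next ridge solve shrinks it further; this is the sparsity-promoting behaviour that the abstract attributes to the reweighted ℓ2 view.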

Cite

Text

Zhang et al. "EP-GIG Priors and Applications in Bayesian Sparse Learning." Journal of Machine Learning Research, 13:2031-2061, 2012.

Markdown

[Zhang et al. "EP-GIG Priors and Applications in Bayesian Sparse Learning." Journal of Machine Learning Research, 13:2031-2061, 2012.](https://mlanthology.org/jmlr/2012/zhang2012jmlr-epgig/)

BibTeX

@article{zhang2012jmlr-epgig,
  title     = {{EP-GIG Priors and Applications in Bayesian Sparse Learning}},
  author    = {Zhang, Zhihua and Wang, Shusen and Liu, Dehua and Jordan, Michael I.},
  journal   = {Journal of Machine Learning Research},
  year      = {2012},
  pages     = {2031--2061},
  volume    = {13},
  url       = {https://mlanthology.org/jmlr/2012/zhang2012jmlr-epgig/}
}