Large Scale Variational Bayesian Inference for Structured Scale Mixture Models

Abstract

Natural image statistics exhibit hierarchical dependencies across multiple scales. Representing such prior knowledge in nonfactorial latent tree models can substantially boost the performance of image denoising, inpainting, deconvolution, or reconstruction beyond standard factorial "sparse" methodology. We derive a large scale approximate Bayesian inference algorithm for linear models with nonfactorial (latent tree-structured) scale mixture priors. Experimental results on a range of denoising and inpainting problems demonstrate substantially improved performance compared to MAP estimation or to inference with factorial priors.
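To make the "scale mixture" terminology in the abstract concrete, here is a minimal sketch (not the paper's algorithm) of the prototypical factorial case it builds on: a Gaussian whose variance is itself random. Drawing the variance from an exponential distribution yields a Laplace marginal, the classic "sparse" prior; the paper's contribution is to couple such latent scales in a tree rather than keep them independent. All names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian scale mixture: draw a latent variance s per coefficient,
# then x | s ~ N(0, s). With s ~ Exponential, the marginal of x is Laplace,
# the standard factorial "sparse" prior the abstract contrasts against.
n = 200_000
s = rng.exponential(scale=2.0, size=n)       # latent scales (variances)
x = rng.normal(loc=0.0, scale=np.sqrt(s))    # x | s ~ N(0, s)

# The marginal is heavier-tailed than a Gaussian of equal variance:
# the excess kurtosis of a Laplace distribution is 3 (Gaussian: 0).
kurtosis = np.mean(x**4) / np.mean(x**2) ** 2 - 3.0
print(kurtosis)  # approximately 3
```

In the structured models the paper studies, the scales `s` would not be drawn independently but would share parents in a latent tree, capturing the multi-scale dependencies of natural images.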

Cite

Text

Ko and Seeger. "Large Scale Variational Bayesian Inference for Structured Scale Mixture Models." International Conference on Machine Learning, 2012.

Markdown

[Ko and Seeger. "Large Scale Variational Bayesian Inference for Structured Scale Mixture Models." International Conference on Machine Learning, 2012.](https://mlanthology.org/icml/2012/ko2012icml-large/)

BibTeX

@inproceedings{ko2012icml-large,
  title     = {{Large Scale Variational Bayesian Inference for Structured Scale Mixture Models}},
  author    = {Ko, Young-Jun and Seeger, Matthias W.},
  booktitle = {International Conference on Machine Learning},
  year      = {2012},
  url       = {https://mlanthology.org/icml/2012/ko2012icml-large/}
}