Inference with Multivariate Heavy-Tails in Linear Models

Abstract

Heavy-tailed distributions arise naturally in many real-life problems. Unfortunately, inference typically cannot be computed in closed form in graphical models that involve such heavy-tailed distributions. In this work, we propose a novel, simple linear graphical model for independent latent random variables, called the linear characteristic model (LCM), defined in the characteristic function domain. Using stable distributions, a heavy-tailed family that generalizes the Cauchy, Lévy, and Gaussian distributions, we show for the first time how to compute both exact and approximate inference in such a linear multivariate graphical model. LCMs are not limited to stable distributions; in fact, LCMs are defined for any random variables (discrete, continuous, or a mixture of both). We present a realistic problem from the field of computer networks to demonstrate the applicability of our construction. Another potential application is iterative decoding of linear channels with non-Gaussian noise.
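As a toy illustration (not code from the paper), the property that makes stable distributions amenable to inference in a linear model is closure under linear combinations: a symmetric α-stable variable with scale γ has characteristic function φ(t) = exp(−(γ|t|)^α), so a weighted sum of independent such variables is again α-stable with a scale computable in closed form. The sketch below checks this empirically for the Cauchy case (α = 1), using the empirical characteristic function; all parameter choices here are illustrative assumptions.

```python
import numpy as np

# Characteristic function of a symmetric alpha-stable law with scale gamma:
#   phi(t) = exp(-(gamma * |t|)^alpha)
# Closure property: for independent X1, X2 with unit scale, a*X1 + b*X2 is
# alpha-stable with scale (|a|^alpha + |b|^alpha)^(1/alpha).
# For the Cauchy case (alpha = 1) this reduces to |a| + |b|.
def sas_cf(t, alpha, gamma):
    return np.exp(-(gamma * np.abs(t)) ** alpha)

rng = np.random.default_rng(0)
a, b = 2.0, 3.0
x1 = rng.standard_cauchy(100_000)
x2 = rng.standard_cauchy(100_000)
y = a * x1 + b * x2  # should be Cauchy with scale |a| + |b| = 5

# Empirical characteristic function of y on a grid of t values,
# compared against the closed-form prediction.
t = np.linspace(-1.0, 1.0, 21)
empirical = np.exp(1j * t[:, None] * y[None, :]).mean(axis=1).real
theoretical = sas_cf(t, alpha=1.0, gamma=abs(a) + abs(b))

print(np.max(np.abs(empirical - theoretical)))  # small sampling error
```

Note that the same check fails for, say, the log-normal distribution: only stable laws reproduce themselves under linear combinations, which is why the paper's construction works in the characteristic function domain rather than the density domain.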

Cite

Text

Bickson and Guestrin. "Inference with Multivariate Heavy-Tails in Linear Models." Neural Information Processing Systems, 2010.

Markdown

[Bickson and Guestrin. "Inference with Multivariate Heavy-Tails in Linear Models." Neural Information Processing Systems, 2010.](https://mlanthology.org/neurips/2010/bickson2010neurips-inference/)

BibTeX

@inproceedings{bickson2010neurips-inference,
  title     = {{Inference with Multivariate Heavy-Tails in Linear Models}},
  author    = {Bickson, Danny and Guestrin, Carlos},
  booktitle = {Neural Information Processing Systems},
  year      = {2010},
  pages     = {208-216},
  url       = {https://mlanthology.org/neurips/2010/bickson2010neurips-inference/}
}