Mixtures of Gaussians and Minimum Relative Entropy Techniques for Modeling Continuous Uncertainties

Abstract

Problems of probabilistic inference and decision making under uncertainty commonly involve continuous random variables. Often these are discretized to a few points, to simplify assessments and computations. An alternative approximation is to fit analytically tractable continuous probability distributions. This approach offers potential advantages in simplicity and accuracy, especially if the variables can be transformed first. This paper shows how a minimum relative entropy criterion can drive both transformation and fitting, illustrating with a power and logarithm family of transformations and mixtures of Gaussian (normal) distributions, which allow the use of efficient influence diagram methods. The fitting procedure in this case is the well-known EM algorithm. The selection of the number of components in a fitted mixture distribution is automated with an objective that trades off accuracy and computational cost.
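The fitting step the abstract refers to is the standard EM algorithm for Gaussian mixtures. A minimal one-dimensional sketch (an illustrative reimplementation, not the paper's code; the function names and initialization scheme are assumptions) is:

```python
import math
import random

def gauss_pdf(x, mu, var):
    """Density of a 1-D Gaussian with mean mu and variance var."""
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def em_fit(data, k=2, iters=100):
    """Fit a k-component 1-D Gaussian mixture by EM.

    Returns (weights, means, variances). Initialization is a crude
    spread of the means across the sample range (an illustrative choice).
    """
    n = len(data)
    lo, hi = min(data), max(data)
    means = [lo + (i + 0.5) * (hi - lo) / k for i in range(k)]
    variances = [((hi - lo) / k) ** 2 + 1e-6] * k
    weights = [1.0 / k] * k
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            p = [w * gauss_pdf(x, m, v)
                 for w, m, v in zip(weights, means, variances)]
            s = sum(p)
            resp.append([pi / s for pi in p])
        # M-step: re-estimate parameters from the responsibilities.
        for j in range(k):
            nj = sum(r[j] for r in resp)
            weights[j] = nj / n
            means[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            variances[j] = sum(r[j] * (x - means[j]) ** 2
                               for r, x in zip(resp, data)) / nj + 1e-9
    return weights, means, variances

random.seed(0)
# Synthetic sample from an equal-weight mixture of N(0, 1) and N(5, 1).
data = ([random.gauss(0.0, 1.0) for _ in range(300)]
        + [random.gauss(5.0, 1.0) for _ in range(300)])
weights, means, variances = em_fit(data, k=2)
```

On this well-separated sample the fitted means land near 0 and 5 with roughly equal weights. The paper's additional contributions, the minimum relative entropy transformation criterion and the automated choice of the number of components, would sit around this inner loop rather than inside it.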

Cite

Text

Poland and Shachter. "Mixtures of Gaussians and Minimum Relative Entropy Techniques for Modeling Continuous Uncertainties." Conference on Uncertainty in Artificial Intelligence, 1993. doi:10.1016/B978-1-4832-1451-1.50027-5

Markdown

[Poland and Shachter. "Mixtures of Gaussians and Minimum Relative Entropy Techniques for Modeling Continuous Uncertainties." Conference on Uncertainty in Artificial Intelligence, 1993.](https://mlanthology.org/uai/1993/poland1993uai-mixtures/) doi:10.1016/B978-1-4832-1451-1.50027-5

BibTeX

@inproceedings{poland1993uai-mixtures,
  title     = {{Mixtures of Gaussians and Minimum Relative Entropy Techniques for Modeling Continuous Uncertainties}},
  author    = {Poland, William B. and Shachter, Ross D.},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year      = {1993},
  pages     = {183--190},
  doi       = {10.1016/B978-1-4832-1451-1.50027-5},
  url       = {https://mlanthology.org/uai/1993/poland1993uai-mixtures/}
}