HELM: Highly Efficient Learning of Mixed Copula Networks
Abstract
Learning the structure of probabilistic graphical models for complex real-valued domains is a formidable computational challenge. This inevitably leads to significant modelling compromises such as discretization or the use of a simplistic Gaussian representation. In this work we address the challenge of efficiently learning truly expressive copula-based networks that facilitate a mix of varied copula families within the same model. Our approach is based on a simple but powerful bivariate building block that is used to perform local model selection highly efficiently, thus bypassing much of the computational burden involved in structure learning. We show how this building block can be used to learn general networks and demonstrate its effectiveness on varied and sizeable real-life domains. Importantly, favorable identification and generalization performance come with dramatic runtime improvements. Indeed, the benefits are such that they allow us to tackle domains that are prohibitive when using standard learning approaches.
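The paper's exact building block is not reproduced here, but the core idea of local model selection over bivariate copula families can be illustrated with a hedged sketch: fit each candidate family to the pseudo-observations of a variable pair by maximum likelihood and keep the family with the highest log-likelihood. The sketch below uses two standard candidates (Gaussian and Clayton copulas) and a simple grid search; the family names, grids, and helper functions are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np
from scipy.stats import norm

def gaussian_copula_loglik(u, v, rho):
    """Log-likelihood of pseudo-observations under a bivariate Gaussian copula."""
    x, y = norm.ppf(u), norm.ppf(v)
    s = 1.0 - rho ** 2
    return np.sum(-0.5 * np.log(s)
                  - (rho ** 2 * (x ** 2 + y ** 2) - 2 * rho * x * y) / (2 * s))

def clayton_loglik(u, v, theta):
    """Log-likelihood under a Clayton copula (theta > 0, lower-tail dependent)."""
    t = u ** (-theta) + v ** (-theta) - 1.0
    return np.sum(np.log1p(theta)
                  - (1 + theta) * (np.log(u) + np.log(v))
                  - (2 + 1 / theta) * np.log(t))

def select_family(u, v):
    """Local model selection: pick the bivariate copula family with the best
    maximized log-likelihood (grid-search MLE; grids are illustrative)."""
    candidates = [
        ("gaussian", max(gaussian_copula_loglik(u, v, r)
                         for r in np.linspace(-0.95, 0.95, 39))),
        ("clayton", max(clayton_loglik(u, v, th)
                        for th in np.linspace(0.05, 10.0, 40))),
    ]
    return max(candidates, key=lambda kv: kv[1])

# Synthetic pair with Gaussian dependence (rho = 0.8).
rng = np.random.default_rng(0)
n = 2000
z = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=n)

# Rank-based pseudo-observations, as is standard in copula estimation.
u = (np.argsort(np.argsort(z[:, 0])) + 1) / (n + 1)
v = (np.argsort(np.argsort(z[:, 1])) + 1) / (n + 1)

family, loglik = select_family(u, v)
print(family, round(loglik, 1))
```

Because each pairwise selection only requires fitting small bivariate models, this kind of local test is cheap, which is the intuition behind bypassing much of the global structure-learning cost.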
Cite
Text
Tenzer and Elidan. "HELM: Highly Efficient Learning of Mixed Copula Networks." Conference on Uncertainty in Artificial Intelligence, 2014.
Markdown
[Tenzer and Elidan. "HELM: Highly Efficient Learning of Mixed Copula Networks." Conference on Uncertainty in Artificial Intelligence, 2014.](https://mlanthology.org/uai/2014/tenzer2014uai-helm/)
BibTeX
@inproceedings{tenzer2014uai-helm,
title = {{HELM: Highly Efficient Learning of Mixed Copula Networks}},
author = {Tenzer, Yaniv and Elidan, Gal},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {2014},
pages = {790--799},
url = {https://mlanthology.org/uai/2014/tenzer2014uai-helm/}
}