Bayesian Learning of Joint Distributions of Objects
Abstract
There is increasing interest across broad application areas in flexible joint models for data having a variety of measurement scales, while also allowing data of complex types, such as functions, images, and documents. We consider a general framework for nonparametric Bayes joint modeling through mixture models that incorporate dependence across data types through a joint mixing measure. The mixing measure is assigned a novel infinite tensor factorization (ITF) prior that allows flexible dependence in cluster allocation across data types. The ITF prior is formulated as a tensor product of stick-breaking processes. Focusing on a convenient special case corresponding to a Parafac factorization, we provide basic theory justifying the flexibility of the proposed prior and the resulting asymptotic properties. For ITF mixtures of product kernels, we develop a new Gibbs sampling algorithm for routine implementation that relies on slice sampling. The methods are compared with alternative joint mixture models based on Dirichlet processes and related approaches through simulations and real data applications.
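To make the stick-breaking/Parafac construction concrete, the following is a minimal illustrative sketch in Python with NumPy, not the paper's implementation: it draws truncated stick-breaking weights and combines them Parafac-style into a joint cluster-allocation matrix over two data types. The truncation levels `R` and `K`, the concentration parameter `alpha`, and all variable names are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking(alpha, K, rng):
    """Truncated stick-breaking weights w_h = v_h * prod_{l<h} (1 - v_l),
    with v_K = 1 so the K weights sum exactly to one."""
    v = rng.beta(1.0, alpha, size=K)
    v[-1] = 1.0  # absorb the remaining mass at the truncation level
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return w

# Illustrative sizes: R shared latent classes, K components per data type.
R, K, alpha = 5, 10, 1.0

nu = stick_breaking(alpha, R, rng)                                  # shared weights
pi1 = np.stack([stick_breaking(alpha, K, rng) for _ in range(R)])   # data type 1
pi2 = np.stack([stick_breaking(alpha, K, rng) for _ in range(R)])   # data type 2

# Parafac-style joint allocation probabilities:
#   P(z1 = h, z2 = k) = sum_r nu_r * pi1[r, h] * pi2[r, k]
joint = np.einsum('r,rh,rk->hk', nu, pi1, pi2)

assert np.isclose(joint.sum(), 1.0)
print(joint.shape)        # (10, 10) joint cluster-allocation matrix
print(joint.sum(axis=1))  # marginal allocation probabilities for data type 1
```

Setting the shared weights `nu` to a point mass recovers identical cluster allocation across data types, while making `pi1` and `pi2` row-constant decouples the types entirely; intermediate draws give the flexible dependence described in the abstract.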
Cite
Text
Banerjee et al. "Bayesian Learning of Joint Distributions of Objects." International Conference on Artificial Intelligence and Statistics, 2013.

Markdown

[Banerjee et al. "Bayesian Learning of Joint Distributions of Objects." International Conference on Artificial Intelligence and Statistics, 2013.](https://mlanthology.org/aistats/2013/banerjee2013aistats-bayesian/)

BibTeX
@inproceedings{banerjee2013aistats-bayesian,
  title     = {{Bayesian Learning of Joint Distributions of Objects}},
  author    = {Banerjee, Anjishnu and Murray, Jared and Dunson, David B.},
  booktitle = {International Conference on Artificial Intelligence and Statistics},
  year      = {2013},
  pages     = {1--9},
  url       = {https://mlanthology.org/aistats/2013/banerjee2013aistats-bayesian/}
}