Thurstonian Boltzmann Machines: Learning from Multiple Inequalities
Abstract
We introduce Thurstonian Boltzmann Machines (TBM), a unified architecture that can naturally incorporate a wide range of data types at the same time. Our motivation rests in the Thurstonian view that many discrete data types can be considered as being generated from a subset of underlying latent continuous variables, and in the observation that each realisation of a discrete type imposes certain inequalities on those variables. Thus learning and inference in TBM reduce to making sense of a set of inequalities. The proposed TBM naturally supports the following types: Gaussian, interval, censored, binary, categorical, multicategorical, ordinal, and (in)complete ranks with and without ties. We demonstrate the versatility and capacity of the proposed model on three applications of very different natures, namely handwritten digit recognition, collaborative filtering, and complex social survey analysis.
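To make the inequality view concrete, here is a minimal illustrative sketch (not the authors' implementation) of how a discrete observation pins its underlying latent continuous "utility" to an interval, and how that constrained latent value can be redrawn from a truncated Gaussian. The threshold values, the mean, and the function names are assumptions made purely for illustration.

```python
# Illustrative sketch only: discrete observations as interval constraints on a
# latent Gaussian variable, resampled by truncated-Gaussian draws.
import numpy as np
from scipy.stats import truncnorm

def interval_for(observation, kind, thresholds=None):
    """Return the (lower, upper) bounds a discrete observation imposes on its latent variable."""
    if kind == "binary":                # x = 1  <=>  u > 0;   x = 0  <=>  u <= 0
        return (0.0, np.inf) if observation == 1 else (-np.inf, 0.0)
    if kind == "ordinal":               # x = k  <=>  theta_{k-1} < u <= theta_k
        return (thresholds[observation - 1], thresholds[observation])
    raise ValueError("type not covered in this sketch")

def sample_latent(mean, lower, upper, scale=1.0):
    """Draw the latent variable from a Gaussian truncated to its feasible interval."""
    a, b = (lower - mean) / scale, (upper - mean) / scale
    return truncnorm.rvs(a, b, loc=mean, scale=scale)

# Assumed cut-points for a 3-level ordinal item; observing level 2 confines u to (-0.5, 0.5].
thresholds = np.array([-np.inf, -0.5, 0.5, np.inf])
lo, hi = interval_for(2, "ordinal", thresholds)
u = sample_latent(mean=0.3, lower=lo, upper=hi)
print(f"latent utility consistent with the observation: {u:.3f}")
```

In the same spirit, a ranking over items translates into order inequalities among their latent utilities; the sketch above only covers the binary and ordinal cases.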
Cite
Text
Tran et al. "Thurstonian Boltzmann Machines: Learning from Multiple Inequalities." International Conference on Machine Learning, 2013.
Markdown
[Tran et al. "Thurstonian Boltzmann Machines: Learning from Multiple Inequalities." International Conference on Machine Learning, 2013.](https://mlanthology.org/icml/2013/tran2013icml-thurstonian/)
BibTeX
@inproceedings{tran2013icml-thurstonian,
title = {{Thurstonian Boltzmann Machines: Learning from Multiple Inequalities}},
author = {Tran, Truyen and Phung, Dinh and Venkatesh, Svetha},
booktitle = {International Conference on Machine Learning},
year = {2013},
pages = {46--54},
volume = {28},
url = {https://mlanthology.org/icml/2013/tran2013icml-thurstonian/}
}