Generalised Coupled Tensor Factorisation

Abstract

We derive algorithms for generalised tensor factorisation (GTF) by building upon the well-established theory of Generalised Linear Models. Our algorithms are general in the sense that we can compute arbitrary factorisations in a message passing framework, derived for a broad class of exponential family distributions including special cases such as Tweedie's distributions corresponding to $\beta$-divergences. By bounding the step size of the Fisher Scoring iteration of the GLM, we obtain general updates for real data and multiplicative updates for non-negative data. The GTF framework is then easily extended to address problems where multiple observed tensors are factorised simultaneously. We illustrate our coupled factorisation approach on synthetic data as well as on a musical audio restoration problem.
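
As a concrete illustration of the multiplicative updates mentioned in the abstract, below is a minimal sketch of a coupled non-negative factorisation in which two observed matrices share one factor and the updates minimise the KL divergence (the $\beta = 1$ member of the $\beta$-divergence family). The function name `coupled_kl_nmf` and all variable names are illustrative assumptions; this is a simplified special case written for intuition, not the paper's general GTF message-passing algorithm.

```python
import numpy as np

def coupled_kl_nmf(X1, X2, rank, n_iter=200, eps=1e-12, seed=0):
    """Sketch of coupled NMF: X1 ~ Z1 @ H1 and X2 ~ Z1 @ H2 with a shared
    factor Z1, using multiplicative updates for the KL divergence
    (beta = 1). Illustrative only; not the paper's full GTF algorithm."""
    rng = np.random.default_rng(seed)
    I, J = X1.shape
    _, K = X2.shape
    Z1 = rng.random((I, rank)) + eps   # shared factor
    H1 = rng.random((rank, J)) + eps   # factor specific to X1
    H2 = rng.random((rank, K)) + eps   # factor specific to X2
    ones1 = np.ones_like(X1)
    ones2 = np.ones_like(X2)
    for _ in range(n_iter):
        # The shared factor accumulates ratio terms from both observations.
        R1 = X1 / (Z1 @ H1 + eps)
        R2 = X2 / (Z1 @ H2 + eps)
        Z1 *= (R1 @ H1.T + R2 @ H2.T) / (ones1 @ H1.T + ones2 @ H2.T + eps)
        # Non-shared factors are updated from their own observation only.
        R1 = X1 / (Z1 @ H1 + eps)
        H1 *= (Z1.T @ R1) / (Z1.T @ ones1 + eps)
        R2 = X2 / (Z1 @ H2 + eps)
        H2 *= (Z1.T @ R2) / (Z1.T @ ones2 + eps)
    return Z1, H1, H2
```

Because the updates are ratios of non-negative quantities, the factors stay non-negative throughout, which is the property the bounded Fisher Scoring step size is used to guarantee in the general framework.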

Cite

Text

Yılmaz et al. "Generalised Coupled Tensor Factorisation." Neural Information Processing Systems, 2011.

Markdown

[Yılmaz et al. "Generalised Coupled Tensor Factorisation." Neural Information Processing Systems, 2011.](https://mlanthology.org/neurips/2011/ylmaz2011neurips-generalised/)

BibTeX

@inproceedings{ylmaz2011neurips-generalised,
  title     = {{Generalised Coupled Tensor Factorisation}},
  author    = {Yılmaz, Kenan Y. and Cemgil, Ali T. and Simsekli, Umut},
  booktitle = {Neural Information Processing Systems},
  year      = {2011},
  pages     = {2151--2159},
  url       = {https://mlanthology.org/neurips/2011/ylmaz2011neurips-generalised/}
}