An Analysis of Tensor Models for Learning on Structured Data
Abstract
While tensor factorizations have become increasingly popular for learning on various forms of structured data, few theoretical results exist on the generalization ability of these methods. Here, we discuss the tensor product as a principled way to represent structured data in vector spaces for machine learning tasks. By extending known bounds for matrix factorizations, we derive generalization error bounds for the tensor case. Furthermore, we analyze analytically and experimentally how tensor factorization behaves when applied to over- and understructured representations, for instance, when two-way tensor factorization, i.e. matrix factorization, is applied to three-way tensor data.
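The "understructured" case mentioned in the abstract can be illustrated with a small sketch: a three-way tensor is flattened (unfolded) into a matrix, and an ordinary two-way matrix factorization is fitted to that unfolding. The sketch below uses NumPy and a truncated SVD as the matrix factorization; the tensor shape, rank, and unfolding convention are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Illustrative three-way tensor, e.g. entity x entity x relation data
# (shape and values are arbitrary, for demonstration only).
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4, 3))

# Mode-1 unfolding: arrange the tensor as a 4 x 12 matrix. Fitting a
# two-way (matrix) factorization to this unfolding is one way to apply
# an "understructured" model to three-way data.
X1 = X.reshape(4, -1)

# Rank-2 truncated SVD as the matrix factorization.
U, s, Vt = np.linalg.svd(X1, full_matrices=False)
r = 2
X1_hat = U[:, :r] * s[:r] @ Vt[:r, :]

# Relative reconstruction error of the low-rank matrix model; a
# three-way factorization of X itself could exploit structure that
# this flattened model ignores.
err = np.linalg.norm(X1 - X1_hat) / np.linalg.norm(X1)
```

This only shows the mechanics of flattening; the paper's contribution is analyzing how such a mismatch between the model's order and the data's structure affects generalization.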
Cite
Text
Nickel and Tresp. "An Analysis of Tensor Models for Learning on Structured Data." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2013. doi:10.1007/978-3-642-40991-2_18
Markdown
[Nickel and Tresp. "An Analysis of Tensor Models for Learning on Structured Data." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2013.](https://mlanthology.org/ecmlpkdd/2013/nickel2013ecmlpkdd-analysis/) doi:10.1007/978-3-642-40991-2_18
BibTeX
@inproceedings{nickel2013ecmlpkdd-analysis,
title = {{An Analysis of Tensor Models for Learning on Structured Data}},
author = {Nickel, Maximilian and Tresp, Volker},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2013},
pages = {272--287},
doi = {10.1007/978-3-642-40991-2_18},
url = {https://mlanthology.org/ecmlpkdd/2013/nickel2013ecmlpkdd-analysis/}
}