Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections
Abstract
Sequential data such as time series, video, or text can be challenging to analyse as the ordered structure gives rise to complex dependencies. At the heart of this is non-commutativity, in the sense that reordering the elements of a sequence can completely change its meaning. We use a classical mathematical object -- the free algebra -- to capture this non-commutativity. To address the innate computational complexity of this algebra, we use compositions of low-rank tensor projections. This yields modular and scalable building blocks that give state-of-the-art performance on standard benchmarks such as multivariate time series classification, mortality prediction and generative models for video.
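To make the idea concrete: for rank-1 functionals, evaluating a degree-m component of the free-algebra embedding of a sequence reduces to a sum over increasing index tuples of products of inner products, computable by a simple dynamic program. The sketch below is illustrative only and is not the authors' implementation; the function name and the dynamic-programming formulation are assumptions made for exposition.

```python
import numpy as np

def low_rank_functional(x, zs):
    """Evaluate a rank-1, degree-m linear functional on a sequence.

    Illustrative sketch (not the paper's code): computes
        sum over i1 < ... < im of prod_j <z_j, x[i_j]>
    via a dynamic program in O(T * m) time.

    x:  array of shape (T, d), the sequence of T vectors in R^d
    zs: list of m weight vectors of shape (d,), one per tensor level
    """
    T = x.shape[0]
    m = len(zs)
    # a[j, t] = <z_j, x_t>, precomputed for all levels and time steps
    a = np.array([x @ z for z in zs])  # shape (m, T)
    # dp[j] accumulates the sum over increasing tuples of length j
    dp = np.zeros(m + 1)
    dp[0] = 1.0
    for t in range(T):
        # update from high degree down so each step t extends only
        # tuples whose last index is strictly smaller than t
        for j in range(m, 0, -1):
            dp[j] += dp[j - 1] * a[j - 1, t]
    return dp[m]
```

Stacking several such functionals (with different weight vectors) and composing the resulting sequence-to-sequence maps is what yields the modular building blocks the abstract refers to.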
Cite
Text
Toth et al. "Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections." International Conference on Learning Representations, 2021.
Markdown
[Toth et al. "Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections." International Conference on Learning Representations, 2021.](https://mlanthology.org/iclr/2021/toth2021iclr-seq2tens/)
BibTeX
@inproceedings{toth2021iclr-seq2tens,
  title     = {{Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections}},
  author    = {Toth, Csaba and Bonnier, Patric and Oberhauser, Harald},
  booktitle = {International Conference on Learning Representations},
  year      = {2021},
  url       = {https://mlanthology.org/iclr/2021/toth2021iclr-seq2tens/}
}