STable: Permutation-Based Framework for Table Generation in Sequence-to-Sequence Models

Abstract

We present a permutation-based text-to-table neural framework that unifies diverse NLP tasks under a common table-output format. During training, the framework takes a probabilistic approach, maximizing the expected log-likelihood of a table's content across all random permutations of the factorization order. At inference, we leverage the model's ability to generate cells in any order to reduce model uncertainty and minimize error propagation. Our method improves text-to-table performance by up to 15% over previous solutions and accelerates inference by up to 4× on some datasets, all while preserving output quality.
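As a concrete reading of the training objective sketched in the abstract, the expected log-likelihood over random factorization orders can be written as below. This is a sketch in standard notation rather than the paper's own formula: $x$ denotes the input text, $c_1, \dots, c_N$ the table's cells, $S_N$ the set of permutations of $\{1, \dots, N\}$, and $\theta$ the model parameters.

$$\mathcal{L}(\theta) = \mathbb{E}_{\sigma \sim \mathcal{U}(S_N)} \left[ \sum_{i=1}^{N} \log p_\theta\!\left(c_{\sigma(i)} \mid c_{\sigma(1)}, \dots, c_{\sigma(i-1)}, x\right) \right]$$

Averaging over orders teaches the model to predict any cell given any subset of already-known cells, which is what makes order-free decoding possible at inference time. The inference idea can in turn be illustrated with a minimal, hypothetical sketch; `model.score_cell` is an assumed interface standing in for the decoder's actual cell-scoring mechanism, not an API from the paper's code.

def decode_table(model, source_text, slots):
    # Hypothetical confidence-guided decoding: commit first to the cell the
    # model is most certain about, instead of a fixed left-to-right order,
    # so early mistakes are less likely to contaminate later cells.
    # `model.score_cell(source_text, filled, slot)` is assumed to return
    # (best candidate text for `slot`, its log-probability).
    filled = {}
    remaining = set(slots)
    while remaining:
        candidates = {
            slot: model.score_cell(source_text, filled, slot)
            for slot in remaining
        }
        best = max(remaining, key=lambda s: candidates[s][1])
        filled[best] = candidates[best][0]
        remaining.remove(best)
    return filled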

Cite

Text

Pietruszka et al. "STable: Permutation-Based Framework for Table Generation in Sequence-to-Sequence Models." ICML 2023 Workshops: SPIGM, 2023.

Markdown

[Pietruszka et al. "STable: Permutation-Based Framework for Table Generation in Sequence-to-Sequence Models." ICML 2023 Workshops: SPIGM, 2023.](https://mlanthology.org/icmlw/2023/pietruszka2023icmlw-stable/)

BibTeX

@inproceedings{pietruszka2023icmlw-stable,
  title     = {{STable: Permutation-Based Framework for Table Generation in Sequence-to-Sequence Models}},
  author    = {Pietruszka, Michał and Turski, Michał and Borchmann, Łukasz and Dwojak, Tomasz and Pałka, Gabriela and Szyndler, Karolina and Jurkiewicz, Dawid and Garncarek, Łukasz},
  booktitle = {ICML 2023 Workshops: SPIGM},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/pietruszka2023icmlw-stable/}
}