Sequence Modeling with Unconstrained Generation Order

Abstract

The dominant approach to sequence generation is to produce a sequence in some predefined order, e.g., left to right. In contrast, we propose a more general model that can generate the output sequence by inserting tokens in arbitrary order. Our model learns the decoding order as a result of its training procedure. Our experiments show that this model outperforms fixed-order models on a number of sequence generation tasks, such as Machine Translation, Image-to-LaTeX, and Image Captioning.
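To make the idea concrete: instead of always appending the next token, an insertion-based decoder scores every (position, token) pair for the current partial sequence and inserts the best-scoring token at the chosen slot. The following is a minimal sketch, not the paper's actual code; `score_insertions` is a hypothetical stand-in for the trained model (random scores here), and the vocabulary is a toy example.

    import random
    from typing import List, Tuple

    VOCAB = ["the", "cat", "sat", "<eos>"]

    def score_insertions(partial: List[str]) -> List[Tuple[float, int, str]]:
        # Hypothetical model: score inserting each vocabulary token at each
        # of the len(partial) + 1 possible slots. A real model would condition
        # on the source input and the current partial sequence.
        return [(random.random(), pos, tok)
                for pos in range(len(partial) + 1)
                for tok in VOCAB]

    def insertion_decode(max_steps: int = 10) -> List[str]:
        partial: List[str] = []
        for _ in range(max_steps):
            score, pos, tok = max(score_insertions(partial))
            if tok == "<eos>":            # model signals the sequence is complete
                break
            partial.insert(pos, tok)      # tokens may land anywhere, not just at the end
        return partial

    if __name__ == "__main__":
        print(insertion_decode())

Because the training procedure never fixes the insertion positions, the model is free to discover whatever generation order works best for the task, which is the property the abstract refers to.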

Cite

Text

Emelianenko et al. "Sequence Modeling with Unconstrained Generation Order." Neural Information Processing Systems, 2019.

Markdown

[Emelianenko et al. "Sequence Modeling with Unconstrained Generation Order." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/emelianenko2019neurips-sequence/)

BibTeX

@inproceedings{emelianenko2019neurips-sequence,
  title     = {{Sequence Modeling with Unconstrained Generation Order}},
  author    = {Emelianenko, Dmitrii and Voita, Elena and Serdyukov, Pavel},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {7700--7711},
  url       = {https://mlanthology.org/neurips/2019/emelianenko2019neurips-sequence/}
}