Charformer: Fast Character Transformers via Gradient-Based Subword Tokenization

Abstract

State-of-the-art models in natural language processing rely on separate rigid subword tokenization algorithms, which limit their generalization ability and adaptation to new settings. In this paper, we propose a new model inductive bias that learns a subword tokenization end-to-end as part of the model. To this end, we introduce a soft gradient-based subword tokenization module (GBST) that automatically learns latent subword representations from characters in a data-driven fashion. Concretely, GBST enumerates candidate subword blocks and learns to score them in a position-wise fashion using a block scoring network. We additionally introduce Charformer, a deep Transformer model that integrates GBST and operates on the character level. Via extensive experiments on English GLUE, multilingual, and noisy text datasets, we show that Charformer outperforms a series of competitive character-level baselines while generally performing on par with, and sometimes outperforming, subword-based models. Additionally, Charformer is fast, improving the speed of vanilla character-level Transformers while maintaining quality. We believe this work paves the way for highly performant token-free models that are trained completely end-to-end.
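To make the abstract's description of GBST concrete, here is a minimal numpy sketch of soft gradient-based subword tokenization as summarized above: pooled candidate blocks of several sizes are scored position-wise and mixed into latent subword representations, which are then downsampled. The pooling choice, the single-linear-layer scoring network, the block sizes, and the downsampling rate are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of gradient-based subword tokenization (GBST), following only
# the high-level description in the abstract. Details are assumptions.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gbst(char_embeddings, block_sizes=(1, 2, 3, 4), downsample_rate=2, rng=None):
    """Soft subword tokenization over a [seq_len, dim] matrix of character embeddings."""
    rng = np.random.default_rng(0) if rng is None else rng
    seq_len, dim = char_embeddings.shape
    # Block scoring "network": here, a single linear map from a block embedding to a scalar.
    score_weights = rng.normal(scale=dim ** -0.5, size=(dim,))

    candidate_blocks = []   # one [seq_len, dim] matrix of pooled block embeddings per block size
    candidate_scores = []   # one [seq_len] score vector per block size
    for b in block_sizes:
        pooled = np.empty_like(char_embeddings)
        for i in range(seq_len):
            start = (i // b) * b                      # block of size b that position i falls into
            block = char_embeddings[start:start + b]
            pooled[i] = block.mean(axis=0)            # mean-pool characters within the block
        candidate_blocks.append(pooled)
        candidate_scores.append(pooled @ score_weights)

    blocks = np.stack(candidate_blocks, axis=1)                   # [seq_len, num_block_sizes, dim]
    scores = softmax(np.stack(candidate_scores, axis=1), axis=1)  # position-wise softmax over block sizes
    latent_subwords = (scores[..., None] * blocks).sum(axis=1)    # soft mixture of candidate blocks

    # Downsample to shorten the sequence before the Transformer stack.
    trimmed = latent_subwords[: (seq_len // downsample_rate) * downsample_rate]
    return trimmed.reshape(-1, downsample_rate, dim).mean(axis=1)

# Example: 16 characters with 8-dimensional embeddings -> 8 latent subword vectors.
chars = np.random.default_rng(1).normal(size=(16, 8))
print(gbst(chars).shape)  # (8, 8)
```

In the full model, the character embeddings, scoring network, and downstream Transformer would be trained jointly, so the block scores are learned end-to-end rather than fixed as in this sketch.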

Cite

Text

Tay et al. "Charformer: Fast Character Transformers via Gradient-Based Subword Tokenization." International Conference on Learning Representations, 2022.

Markdown

[Tay et al. "Charformer: Fast Character Transformers via Gradient-Based Subword Tokenization." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/tay2022iclr-charformer/)

BibTeX

@inproceedings{tay2022iclr-charformer,
  title     = {{Charformer: Fast Character Transformers via Gradient-Based Subword Tokenization}},
  author    = {Tay, Yi and Tran, Vinh Q. and Ruder, Sebastian and Gupta, Jai and Chung, Hyung Won and Bahri, Dara and Qin, Zhen and Baumgartner, Simon and Yu, Cong and Metzler, Donald},
  booktitle = {International Conference on Learning Representations},
  year      = {2022},
  url       = {https://mlanthology.org/iclr/2022/tay2022iclr-charformer/}
}