Hyena Hierarchy: Towards Larger Convolutional Language Models
Abstract
Recent advances in deep learning have relied heavily on the use of large Transformers due to their ability to learn at scale. However, the core building block of Transformers, the attention operator, exhibits quadratic cost in sequence length, limiting the amount of context accessible. Existing subquadratic methods based on low-rank and sparse approximations need to be combined with dense attention layers to match Transformers at scale, indicating a gap in capability. In this work, we propose Hyena, a subquadratic drop-in replacement for attention constructed by interleaving implicitly parametrized long convolutions and data-controlled gating. In challenging reasoning tasks on sequences of thousands to hundreds of thousands of tokens, Hyena improves accuracy by more than 50 points over operators relying on state-space models, transfer functions, and other implicit and explicit methods, matching attention-based models. We set a new state-of-the-art for dense-attention-free architectures on language modeling in standard datasets WikiText103 and The Pile, reaching Transformer quality with a 20% reduction in training compute required at sequence length 2k. Hyena operators are 2x faster than highly optimized attention at sequence length 8k, with speedups of 100x at 64k.
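The abstract's core mechanism — long convolutions with implicitly parametrized filters, interleaved with data-controlled (elementwise) gating — can be sketched as below. This is an illustrative NumPy sketch of the recurrence, not the paper's optimized implementation; the function names and the use of FFT-based causal convolution are assumptions for exposition.

```python
import numpy as np

def fft_long_conv(u, h):
    """Causal long convolution of signal u with filter h via FFT.

    Zero-padding to length 2L turns circular convolution into linear
    convolution; truncating to the first L samples keeps causality.
    Cost is O(L log L) rather than the O(L^2) of direct convolution.
    """
    L = u.shape[-1]
    u_f = np.fft.rfft(u, n=2 * L)
    h_f = np.fft.rfft(h, n=2 * L)
    return np.fft.irfft(u_f * h_f, n=2 * L)[..., :L]

def hyena_operator(v, gates, filters):
    """Order-N Hyena-style recurrence (illustrative).

    v:       (L,) value projection of the input
    gates:   list of N (L,) data-controlled gating projections
    filters: list of N (L,) long-convolution filters (in the paper these
             are produced implicitly by a small network over positions)
    Each step convolves with a long filter, then gates elementwise.
    """
    z = v
    for x, h in zip(gates, filters):
        z = x * fft_long_conv(z, h)
    return z
```

With N gates and filters, one operator costs O(N L log L), which is how the subquadratic scaling in sequence length L claimed in the abstract arises; the projections `v` and `gates` would come from learned linear maps of the input in a full model.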
Cite
Text
Poli et al. "Hyena Hierarchy: Towards Larger Convolutional Language Models." International Conference on Machine Learning, 2023.
Markdown
[Poli et al. "Hyena Hierarchy: Towards Larger Convolutional Language Models." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/poli2023icml-hyena/)
BibTeX
@inproceedings{poli2023icml-hyena,
title = {{Hyena Hierarchy: Towards Larger Convolutional Language Models}},
author = {Poli, Michael and Massaroli, Stefano and Nguyen, Eric and Fu, Daniel Y. and Dao, Tri and Baccus, Stephen and Bengio, Yoshua and Ermon, Stefano and R{\'e}, Christopher},
booktitle = {International Conference on Machine Learning},
year = {2023},
pages = {28043--28078},
volume = {202},
url = {https://mlanthology.org/icml/2023/poli2023icml-hyena/}
}