Rope to Nope and Back Again: A New Hybrid Attention Strategy

Abstract

Long-context large language models (LLMs) have achieved remarkable advancements, driven by techniques like Rotary Position Embedding (RoPE) (Su et al., 2023) and its extensions (Chen et al., 2023; Liu et al., 2024c; Peng et al., 2023). By adjusting RoPE parameters and incorporating training data with extended contexts, we can train performant models with considerably longer input sequences. However, existing RoPE-based methods exhibit performance limitations when applied to extended context lengths. This paper presents a comprehensive analysis of various attention mechanisms, including RoPE, No Positional Embedding (NoPE), and Query-Key Normalization (QK-Norm), characterizing their strengths and shortcomings in long-context modeling. Our investigation identifies distinctive attention patterns in these methods and highlights their impact on long-context performance, providing valuable insights for architectural design. Building on these findings, we propose a novel architecture featuring a hybrid attention mechanism that integrates global and local attention spans. This design not only surpasses conventional RoPE-based transformer models with full attention in both long and short context tasks but also delivers substantial efficiency gains during training and inference.
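
To make the hybrid design concrete, below is a minimal sketch of how local sliding-window attention layers with RoPE might be interleaved with global full-context layers that use no positional embedding (NoPE). This is not the authors' released code; the window size, the interleaving ratio, and all helper names (`apply_rope`, `hybrid_attention`, `layer_kind`) are illustrative assumptions based only on the abstract and title, not values from the paper.

```python
# Sketch only: assumed assignment of RoPE to local (sliding-window) layers and
# NoPE to global (full-attention) layers; ratios and sizes are hypothetical.
import torch
import torch.nn.functional as F


def apply_rope(x, base=10000.0):
    """Apply rotary position embedding to x of shape (batch, heads, seq, head_dim)."""
    b, h, t, d = x.shape
    half = d // 2
    # Per-dimension rotation frequencies and per-position rotation angles.
    freqs = base ** (-torch.arange(half, dtype=x.dtype, device=x.device) / half)
    angles = torch.arange(t, dtype=x.dtype, device=x.device)[:, None] * freqs[None, :]
    cos, sin = angles.cos(), angles.sin()          # each of shape (seq, half)
    x1, x2 = x[..., :half], x[..., half:]
    return torch.cat([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)


def sliding_window_mask(seq_len, window, device):
    """Boolean mask: each query attends only to the previous `window` positions (causal)."""
    idx = torch.arange(seq_len, device=device)
    return (idx[None, :] <= idx[:, None]) & (idx[:, None] - idx[None, :] < window)


def hybrid_attention(q, k, v, local_window=None, use_rope=False):
    """One attention layer: local layers pass local_window and use_rope=True,
    global layers pass neither (full causal attention, no positional embedding)."""
    if use_rope:
        q, k = apply_rope(q), apply_rope(k)
    if local_window is None:
        return F.scaled_dot_product_attention(q, k, v, is_causal=True)
    mask = sliding_window_mask(q.size(-2), local_window, q.device)
    return F.scaled_dot_product_attention(q, k, v, attn_mask=mask)


# Hypothetical interleaving schedule: one global NoPE layer every fourth layer.
def layer_kind(layer_idx, global_every=4):
    return "global-nope" if (layer_idx + 1) % global_every == 0 else "local-rope"


if __name__ == "__main__":
    q = k = v = torch.randn(1, 4, 16, 64)
    local_out = hybrid_attention(q, k, v, local_window=8, use_rope=True)   # local RoPE layer
    global_out = hybrid_attention(q, k, v, use_rope=False)                 # global NoPE layer
    print(local_out.shape, global_out.shape)
```

Under this reading, only the periodic NoPE layers attend over the full context, while the sliding-window layers keep per-layer cost roughly constant in sequence length, which is consistent with the efficiency gains the abstract claims for training and inference; the paper itself should be consulted for the actual layer assignment and ratio.
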

Cite

Text

Yang et al. "Rope to Nope and Back Again: A New Hybrid Attention Strategy." Advances in Neural Information Processing Systems, 2025.

Markdown

[Yang et al. "Rope to Nope and Back Again: A New Hybrid Attention Strategy." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/yang2025neurips-rope/)

BibTeX

@inproceedings{yang2025neurips-rope,
  title     = {{Rope to Nope and Back Again: A New Hybrid Attention Strategy}},
  author    = {Yang, Bowen and Venkitesh, Bharat and Gnaneshwar, Dwaraknath and Lin, Hangyu and Cairuz, David and Blunsom, Phil and Locatelli, Acyr},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/yang2025neurips-rope/}
}