LongRoPE2: Near-Lossless LLM Context Window Scaling

Abstract

LongRoPE2 is a novel approach that extends the effective context window of pre-trained large language models (LLMs) to the target length while preserving performance on the original, shorter context window. This is achieved through three contributions: (1) a hypothesis that insufficient training in higher RoPE dimensions contributes to the persistent out-of-distribution (OOD) issues observed in existing methods; (2) an effective RoPE rescaling algorithm that adopts evolutionary search guided by "needle-driven" perplexity to address the insufficient-training problem; and (3) a mixed context window training approach that fine-tunes model weights to adopt rescaled RoPE for long-context sequences while preserving short-context performance with the original RoPE. Extensive experiments on LLaMA3-8B and Phi3-mini-3.8B across various benchmarks validate the hypothesis and demonstrate the effectiveness of LongRoPE2. Remarkably, LongRoPE2 extends LLaMA3-8B to a 128K effective context length while retaining over 98.5% of its short-context performance, using only 10B training tokens, 80x fewer than Meta's approach, which fails to reach the target effective context length.
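
To make the rescaling idea concrete, the sketch below shows per-dimension RoPE frequency rescaling in Python. This is an illustrative assumption, not the paper's implementation: the function names, the uniform placeholder factors, and the 8K-to-128K figures are hypothetical, and in LongRoPE2 the per-dimension factors are instead found by evolutionary search guided by needle-driven perplexity.

```python
import numpy as np

def rope_frequencies(head_dim: int, base: float = 10000.0) -> np.ndarray:
    """Standard RoPE inverse frequencies: theta_i = base^(-2i/d) for each dimension pair."""
    return base ** (-np.arange(0, head_dim, 2) / head_dim)

def rescaled_frequencies(head_dim: int, rescale_factors: np.ndarray,
                         base: float = 10000.0) -> np.ndarray:
    """Divide each dimension pair's frequency by its rescale factor.

    LongRoPE-style methods search for these per-dimension factors; the uniform
    placeholder used in the example below is purely illustrative.
    """
    return rope_frequencies(head_dim, base) / rescale_factors

def rope_angles(positions: np.ndarray, inv_freq: np.ndarray) -> np.ndarray:
    """Rotation angles as the outer product of token positions and inverse frequencies."""
    return np.outer(positions, inv_freq)

# Hypothetical example: stretching an 8K-trained window toward a 128K target.
head_dim = 128
scale = 128_000 / 8_000                    # target length / original length
factors = np.full(head_dim // 2, scale)    # placeholder; the paper searches per-dimension factors
angles = rope_angles(np.arange(128_000), rescaled_frequencies(head_dim, factors))
```

Under this sketch, larger factors on the higher (lower-frequency) RoPE dimensions keep the rotation angles at long positions within the range seen during pre-training, which is the OOD issue the paper's hypothesis targets.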

Cite

Text

Shang et al. "LongRoPE2: Near-Lossless LLM Context Window Scaling." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Shang et al. "LongRoPE2: Near-Lossless LLM Context Window Scaling." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/shang2025icml-longrope2/)

BibTeX

@inproceedings{shang2025icml-longrope2,
  title     = {{LongRoPE2: Near-Lossless LLM Context Window Scaling}},
  author    = {Shang, Ning and Zhang, Li Lyna and Wang, Siyuan and Zhang, Gaokai and Lopez, Gilsinia and Yang, Fan and Chen, Weizhu and Yang, Mao},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {54203--54218},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/shang2025icml-longrope2/}
}