AdmTree: Compressing Lengthy Context with Adaptive Semantic Trees
Abstract
The quadratic complexity of self-attention limits Large Language Models (LLMs) in processing long contexts, a capability vital for many advanced applications. Context compression aims to mitigate this computational barrier while preserving essential semantic information. However, existing methods often falter: explicit methods can sacrifice local detail, while implicit ones may exhibit positional biases, struggle with information degradation, or fail to capture long-range semantic dependencies. We introduce AdmTree, a novel framework for adaptive, hierarchical context compression designed to maintain high semantic fidelity while remaining efficient. AdmTree dynamically segments the input based on information density, employing gist tokens to summarize variable-length segments as the leaves of a semantic binary tree. This structure, combined with a lightweight aggregation mechanism and a frozen backbone LLM (minimizing new trainable parameters), enables efficient hierarchical abstraction of the context. By preserving fine-grained details alongside global semantic coherence, mitigating position bias, and adapting dynamically to content, AdmTree comprehensively preserves the semantic information of lengthy contexts.
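To make the pipeline described above concrete, the following is a minimal Python sketch of the general idea only: adaptively segment the input, summarize each segment into a leaf, and merge leaves pairwise into a binary tree whose root abstracts the whole context. Every name here (Node, segment_by_density, summarize_segment, aggregate, build_tree) is a hypothetical stand-in; the paper's actual information-density segmentation, gist-token summarization, and lightweight aggregation mechanism are not reproduced.

# Illustrative sketch of hierarchical context compression via a binary tree.
# All components are toy stand-ins, not the AdmTree implementation.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Node:
    """A tree node; `summary` stands in for a gist-token representation."""
    summary: List[float]
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def segment_by_density(tokens: List[str]) -> List[List[str]]:
    # Stand-in for adaptive segmentation: the real method would place
    # boundaries by information density; here we use fixed-size chunks.
    size = 4
    return [tokens[i:i + size] for i in range(0, len(tokens), size)]

def summarize_segment(segment: List[str]) -> List[float]:
    # Stand-in for gist-token summarization by a frozen backbone LLM.
    return [float(len(tok)) for tok in segment]  # toy "embedding"

def aggregate(a: List[float], b: List[float]) -> List[float]:
    # Stand-in for the lightweight aggregation mechanism:
    # pad to equal length, then average the two child summaries.
    n = max(len(a), len(b))
    a = a + [0.0] * (n - len(a))
    b = b + [0.0] * (n - len(b))
    return [(x + y) / 2 for x, y in zip(a, b)]

def build_tree(tokens: List[str]) -> Node:
    """Summarize segments into leaves, then merge pairwise bottom-up."""
    level = [Node(summarize_segment(seg)) for seg in segment_by_density(tokens)]
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level) - 1, 2):
            l, r = level[i], level[i + 1]
            nxt.append(Node(aggregate(l.summary, r.summary), l, r))
        if len(level) % 2:  # carry an unpaired node up unchanged
            nxt.append(level[-1])
        level = nxt
    return level[0]

if __name__ == "__main__":
    root = build_tree("the quick brown fox jumps over the lazy dog".split())
    print(root.summary)  # the root summary abstracts the entire context

The bottom-up pairwise merge keeps the tree roughly balanced, so coarse summaries near the root coexist with fine-grained leaves, which is the structural property the abstract attributes to AdmTree's preservation of both local detail and global coherence.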
Cite
Text
Li et al. "AdmTree: Compressing Lengthy Context with Adaptive Semantic Trees." Advances in Neural Information Processing Systems, 2025.Markdown
[Li et al. "AdmTree: Compressing Lengthy Context with Adaptive Semantic Trees." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/li2025neurips-admtree/)BibTeX
@inproceedings{li2025neurips-admtree,
  title = {{AdmTree: Compressing Lengthy Context with Adaptive Semantic Trees}},
  author = {Li, Yangning and Chen, Shaoshen and Li, Yinghui and Chen, Yankai and Zheng, Hai-Tao and Wang, Hui and Jiang, Wenhao and Yu, Philip S.},
  booktitle = {Advances in Neural Information Processing Systems},
  year = {2025},
  url = {https://mlanthology.org/neurips/2025/li2025neurips-admtree/}
}