Uni-Sign: Toward Unified Sign Language Understanding at Scale

Abstract

Sign language pre-training has gained increasing attention for its ability to enhance performance across various sign language understanding (SLU) tasks. However, existing methods often suffer from a gap between pre-training and fine-tuning, leading to suboptimal results. To address this, we propose Uni-Sign, a unified pre-training framework that eliminates this gap through a large-scale generative pre-training strategy and a novel fine-tuning paradigm. First, we introduce CSL-News, a large-scale Chinese Sign Language (CSL) dataset containing 1,985 hours of video paired with textual annotations, which enables effective large-scale pre-training. Second, Uni-Sign unifies SLU tasks by treating every downstream task as a single sign language translation (SLT) task during fine-tuning, ensuring seamless knowledge transfer between pre-training and fine-tuning. Furthermore, we incorporate a prior-guided fusion (PGF) module and a score-aware sampling strategy to efficiently fuse pose and RGB information, mitigating keypoint inaccuracies and improving computational efficiency. Extensive experiments on multiple SLU benchmarks demonstrate that Uni-Sign achieves state-of-the-art performance across downstream SLU tasks. Dataset and code are available at github.com/ZechengLi19/Uni-Sign.
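To make the prior-guided fusion idea concrete, below is a minimal sketch of fusing pose and RGB features while using keypoint-detector confidence scores as a prior to down-weight unreliable pose features. This is an illustrative assumption, not the paper's actual PGF module: the class name `PriorGuidedFusionSketch`, the cross-attention design, and all tensor shapes are hypothetical; consult the released code at github.com/ZechengLi19/Uni-Sign for the real implementation.

```python
# Hypothetical sketch of prior-guided pose/RGB fusion. All names and the
# cross-attention design are assumptions, not the paper's PGF module.
import torch
import torch.nn as nn

class PriorGuidedFusionSketch(nn.Module):
    """Fuse pose and RGB features, weighting pose by keypoint confidence."""

    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        # Pose features attend to RGB features to compensate for noisy keypoints.
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, pose_feat, rgb_feat, kp_scores):
        # kp_scores: (B, T, 1) keypoint confidence, used as a prior to
        # suppress pose features from frames with inaccurate keypoints.
        gated_pose = pose_feat * kp_scores
        fused, _ = self.cross_attn(gated_pose, rgb_feat, rgb_feat)
        # Residual connection keeps the original pose stream intact.
        return self.norm(pose_feat + fused)

# Toy usage: batch of 2 clips, 16 frames, 256-dim features per frame.
pgf = PriorGuidedFusionSketch(dim=256)
pose = torch.randn(2, 16, 256)
rgb = torch.randn(2, 16, 256)
scores = torch.rand(2, 16, 1)
out = pgf(pose, rgb, scores)
print(out.shape)  # torch.Size([2, 16, 256])
```

Under this reading, the confidence prior decides how much each frame's pose representation should be trusted before it queries the RGB stream, which matches the abstract's stated goal of addressing keypoint inaccuracies while keeping the heavier RGB branch's contribution selective.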

Cite

Text

Li et al. "Uni-Sign: Toward Unified Sign Language Understanding at Scale." International Conference on Learning Representations, 2025.

Markdown

[Li et al. "Uni-Sign: Toward Unified Sign Language Understanding at Scale." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/li2025iclr-unisign/)

BibTeX

@inproceedings{li2025iclr-unisign,
  title     = {{Uni-Sign: Toward Unified Sign Language Understanding at Scale}},
  author    = {Li, Zecheng and Zhou, Wengang and Zhao, Weichao and Wu, Kepeng and Hu, Hezhen and Li, Houqiang},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/li2025iclr-unisign/}
}