OLMD: Orientation-Aware Long-Term Motion Decoupling for Continuous Sign Language Recognition

Abstract

The primary challenge in continuous sign language recognition (CSLR) stems from the presence of multi-orientational and long-term motions. However, current research largely overlooks these crucial aspects, which significantly limits accuracy. To tackle these issues, we propose a novel CSLR framework: Orientation-aware Long-term Motion Decoupling (OLMD), which efficiently aggregates long-term motions and decouples multi-orientational signals into easily interpretable components. Specifically, our innovative Long-term Motion Aggregation (LMA) module filters out static redundancy while adaptively capturing abundant features of long-term motions. We further enhance orientation awareness by decoupling complex movements into horizontal and vertical components, allowing for motion purification in both orientations. Additionally, two coupling mechanisms are proposed: stage coupling and cross-stage coupling, which together enrich multi-scale features and improve the generalization capability of the model. Experimentally, OLMD achieves state-of-the-art (SOTA) performance on three large-scale datasets: PHOENIX14, PHOENIX14-T, and CSL-Daily. Notably, we improve the word error rate (WER) on PHOENIX14 by an absolute 1.6% compared to the previous SOTA.
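To illustrate the core idea of orientation decoupling described above, the following is a minimal sketch, not the authors' implementation: frame-to-frame keypoint motion is split into horizontal and vertical components, and near-static frames are dropped (a crude stand-in for the static-redundancy filtering performed by the LMA module). The function name, threshold, and input layout are all assumptions for illustration.

```python
import numpy as np

def decouple_motion(keypoints, static_thresh=1e-3):
    """Illustrative sketch (not OLMD's actual module):
    split frame-to-frame keypoint motion into horizontal and
    vertical components, discarding near-static frames.

    keypoints: (T, J, 2) array of (x, y) joint positions.
    Returns (horizontal, vertical) motion sequences of shape (T', J).
    """
    motion = np.diff(keypoints, axis=0)           # (T-1, J, 2) displacements
    horiz, vert = motion[..., 0], motion[..., 1]  # decouple the two orientations
    # crude static-redundancy filter: keep only frames with non-trivial motion
    energy = np.abs(motion).sum(axis=(1, 2))
    keep = energy > static_thresh
    return horiz[keep], vert[keep]

# toy usage: 5 frames, 2 joints, pure horizontal drift of 1 px per frame
kp = np.zeros((5, 2, 2))
kp[:, :, 0] = np.arange(5)[:, None]
h, v = decouple_motion(kp)
```

In this toy input all motion lies along the horizontal axis, so the vertical component is zero everywhere; a real model would feed each purified orientation stream into its own feature pathway before the coupling stages.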

Cite

Text

Yu et al. "OLMD: Orientation-Aware Long-Term Motion Decoupling for Continuous Sign Language Recognition." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I9.33052

Markdown

[Yu et al. "OLMD: Orientation-Aware Long-Term Motion Decoupling for Continuous Sign Language Recognition." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/yu2025aaai-olmd/) doi:10.1609/AAAI.V39I9.33052

BibTeX

@inproceedings{yu2025aaai-olmd,
  title     = {{OLMD: Orientation-Aware Long-Term Motion Decoupling for Continuous Sign Language Recognition}},
  author    = {Yu, Yiheng and Liu, Sheng and Feng, Yuan and Xu, Min and Jin, Zhelun and Yang, Xuhua},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {9707--9715},
  doi       = {10.1609/AAAI.V39I9.33052},
  url       = {https://mlanthology.org/aaai/2025/yu2025aaai-olmd/}
}