White-Box Error Correction Code Transformer
Abstract
Error-correcting codes (ECCs) play a crucial role in modern communication systems by ensuring reliable data transmission over noisy channels. While traditional algorithms based on belief propagation suffer from limited decoding performance, transformer-based approaches have emerged as powerful solutions for ECC decoding. However, the internal mechanisms of these models remain largely unexplained, making it challenging to understand and improve their performance. In this paper, we propose a White-Box Error Correction Code Transformer (WECCT) that provides theoretical insights into transformer-based decoding. By formulating the decoding problem from a sparse rate reduction perspective and introducing a novel Multi-head Tanner-subspaces Self Attention mechanism, our approach yields a parameter-efficient and theoretically principled framework for understanding transformer-based decoding. Extensive experiments across various code families demonstrate that this interpretable design achieves competitive performance compared to state-of-the-art decoders.
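To make the code-structured attention idea concrete, the sketch below shows one plausible way a parity-check matrix (i.e., the Tanner graph) can constrain a self-attention head, in the spirit of mask-based ECC transformers. This is an illustrative assumption, not the paper's Multi-head Tanner-subspaces Self Attention mechanism: the `tanner_mask` and `masked_self_attention` helpers, the single-head formulation, and the toy Hamming (7,4) parity-check matrix are all hypothetical choices for demonstration.

```python
# Hedged sketch: NOT the WECCT implementation. Illustrates how a Tanner-graph
# connectivity pattern might restrict which positions attend to each other.
import torch
import torch.nn.functional as F


def tanner_mask(H: torch.Tensor) -> torch.Tensor:
    """Build a boolean connectivity mask over (n + m) nodes from a parity-check
    matrix H of shape (m, n): variable node i connects to check node j iff H[j, i] == 1.
    Self-connections are kept so no attention row is fully masked."""
    m, n = H.shape
    mask = torch.eye(n + m, dtype=torch.bool)
    mask[:n, n:] = H.t().bool()   # variable -> check edges
    mask[n:, :n] = H.bool()       # check -> variable edges
    return mask


def masked_self_attention(x, W_q, W_k, W_v, mask):
    """Single attention head restricted to Tanner-graph neighbours.
    x: (batch, n+m, d); mask: (n+m, n+m) boolean."""
    q, k, v = x @ W_q, x @ W_k, x @ W_v
    scores = q @ k.transpose(-2, -1) / (q.shape[-1] ** 0.5)
    scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v


if __name__ == "__main__":
    # Toy (7,4) Hamming code parity-check matrix, purely for shape checking.
    H = torch.tensor([[1, 1, 0, 1, 1, 0, 0],
                      [1, 0, 1, 1, 0, 1, 0],
                      [0, 1, 1, 1, 0, 0, 1]], dtype=torch.float32)
    d = 8
    x = torch.randn(2, H.shape[0] + H.shape[1], d)   # variable + check node embeddings
    W_q, W_k, W_v = (torch.randn(d, d) for _ in range(3))
    out = masked_self_attention(x, W_q, W_k, W_v, tanner_mask(H))
    print(out.shape)  # torch.Size([2, 10, 8])
```

A multi-head, subspace-projected variant (as the paper's name suggests) would split the embedding into per-head subspaces tied to the code structure; the mask construction above is only meant to show where the parity-check matrix could enter the attention computation.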
Cite
Text
Zheng et al. "White-Box Error Correction Code Transformer." Conference on Parsimony and Learning, 2025.

Markdown
[Zheng et al. "White-Box Error Correction Code Transformer." Conference on Parsimony and Learning, 2025.](https://mlanthology.org/cpal/2025/zheng2025cpal-whitebox/)

BibTeX
@inproceedings{zheng2025cpal-whitebox,
  title     = {{White-Box Error Correction Code Transformer}},
  author    = {Zheng, Ziyan and Lau, Chin Wa and Guo, Nian and Shi, Xiang and Huang, Shao-Lun},
  booktitle = {Conference on Parsimony and Learning},
  year      = {2025},
  pages     = {1292--1306},
  volume    = {280},
  url       = {https://mlanthology.org/cpal/2025/zheng2025cpal-whitebox/}
}