Attention-Based Conditional Random Field for Financial Fraud Detection
Abstract
Financial fraud detection is critical for market transparency and regulatory compliance. Existing methods often ignore the temporal patterns in financial data, which are essential for understanding dynamic financial behaviors and detecting fraud. Moreover, they treat companies as independent entities, overlooking valuable inter-company relationships. To address these issues, we propose ACRF-RNN, a Recurrent Neural Network (RNN) with an Attention-based Conditional Random Field (CRF) for fraud detection. Specifically, we use an RNN with a sliding window to capture temporal dependencies in historical data and an attention-based CRF feature transformer to model inter-company relationships. These components transform raw financial data into optimized features, which are fed into a multi-layer perceptron for classification. In addition, we use the focal loss to alleviate the class imbalance caused by rare fraudulent cases. We also present a novel real-world dataset to evaluate ACRF-RNN. Extensive experiments show that ACRF-RNN outperforms state-of-the-art methods by 15.28% in KS and 4.04% in Recall. Data and code are available at: https://github.com/XNetLab/ACRF-RNN.git.
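For readers unfamiliar with the focal loss mentioned in the abstract, below is a minimal PyTorch sketch of the binary focal loss (Lin et al., 2017). The α and γ values shown are the common defaults from that paper, not necessarily the settings used in ACRF-RNN; see the released code for the authors' actual implementation.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """Binary focal loss: FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t).

    Down-weights well-classified examples so the rare fraudulent class
    contributes more to the gradient than the abundant legitimate class.
    """
    # Unreduced binary cross-entropy gives -log(p_t) per example.
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p = torch.sigmoid(logits)
    # p_t: the model's estimated probability of the ground-truth class.
    p_t = targets * p + (1 - targets) * (1 - p)
    # alpha_t: class-balancing weight (alpha for fraud, 1 - alpha otherwise).
    alpha_t = targets * alpha + (1 - targets) * (1 - alpha)
    return (alpha_t * (1.0 - p_t) ** gamma * bce).mean()

# Usage: logits from the classifier head, targets in {0, 1} (1 = fraud).
logits = torch.randn(8)
targets = torch.randint(0, 2, (8,)).float()
loss = focal_loss(logits, targets)
```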
Cite
Text
Wang et al. "Attention-Based Conditional Random Field for Financial Fraud Detection." International Joint Conference on Artificial Intelligence, 2025. doi:10.24963/IJCAI.2025/870Markdown
[Wang et al. "Attention-Based Conditional Random Field for Financial Fraud Detection." International Joint Conference on Artificial Intelligence, 2025.](https://mlanthology.org/ijcai/2025/wang2025ijcai-attention/) doi:10.24963/IJCAI.2025/870BibTeX
@inproceedings{wang2025ijcai-attention,
title = {{Attention-Based Conditional Random Field for Financial Fraud Detection}},
author = {Wang, Xiaoguang and Wang, Chenxu and Zhang, Luyue and Wang, Xiaole and Wang, Mengqin and Liu, Huanlong and Qin, Tao},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2025},
pages = {7822--7830},
doi = {10.24963/IJCAI.2025/870},
url = {https://mlanthology.org/ijcai/2025/wang2025ijcai-attention/}
}