ACT: An Attentive Convolutional Transformer for Efficient Text Classification
Abstract
Recently, the Transformer has demonstrated promising performance on many NLP tasks and shows a trend of replacing Recurrent Neural Networks (RNNs). Meanwhile, Convolutional Neural Networks (CNNs) have received less attention due to their weak ability to capture sequential and long-distance dependencies, despite their excellent local feature extraction capability. In this paper, we introduce the Attentive Convolutional Transformer (ACT), which combines the advantages of both the Transformer and CNNs for efficient text classification. Specifically, we propose a novel attentive convolution mechanism that attentively exploits the semantic meaning of convolutional filters to transform text from the complex word space to a more informative convolutional filter space in which important n-grams are captured. ACT captures both local and global dependencies effectively while preserving sequential information. Experiments on various text classification tasks, together with detailed analyses, show that ACT is a lightweight, fast, and effective universal text classifier that outperforms CNNs, RNNs, and attentive models including the Transformer.
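To make the abstract's core idea concrete, below is a minimal PyTorch sketch of one plausible reading of the attentive convolution mechanism: the convolutional filters both extract n-gram features and, flattened, serve as attention keys, so each token is re-expressed in "filter space" rather than word space. This is an illustrative assumption, not the authors' implementation; the class name `AttentiveConvolution`, the `query_proj` projection, and all hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentiveConvolution(nn.Module):
    """Hypothetical sketch (not the paper's code): tokens attend over the
    convolutional filter bank, mapping text into the filter space."""

    def __init__(self, embed_dim: int, num_filters: int, kernel_size: int = 3):
        super().__init__()
        # Filters play a dual role: they extract n-gram features and,
        # flattened, act as attention keys spanning the "filter space".
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size,
                              padding=kernel_size // 2)
        self.query_proj = nn.Linear(embed_dim, embed_dim * kernel_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        # n-gram feature map: (batch, seq_len, num_filters)
        feats = self.conv(x.transpose(1, 2)).transpose(1, 2)
        # Flattened filters as keys: (num_filters, embed_dim * kernel_size)
        keys = self.conv.weight.flatten(1)
        # Token queries projected to the same width as the flattened filters.
        queries = self.query_proj(x)
        # Scaled attention of each token over the filter bank:
        # (batch, seq_len, num_filters), softmax across filters.
        scores = queries @ keys.t() / keys.size(1) ** 0.5
        attn = F.softmax(scores, dim=-1)
        # Tokens re-expressed in filter space, gated by n-gram evidence;
        # sequence order is preserved because attention is per-position.
        return attn * feats


# Usage sketch: a batch of 2 sentences, 10 tokens, 128-dim embeddings.
layer = AttentiveConvolution(embed_dim=128, num_filters=64)
out = layer(torch.randn(2, 10, 128))  # -> (2, 10, 64)
```

Note that this per-position design would keep the mechanism lightweight relative to full token-to-token self-attention, consistent with the abstract's efficiency claim, though the paper itself should be consulted for the actual formulation.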
Cite
Text
Li et al. "ACT: An Attentive Convolutional Transformer for Efficient Text Classification." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I15.17566

Markdown

[Li et al. "ACT: An Attentive Convolutional Transformer for Efficient Text Classification." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/li2021aaai-act/) doi:10.1609/AAAI.V35I15.17566

BibTeX
@inproceedings{li2021aaai-act,
title = {{ACT: An Attentive Convolutional Transformer for Efficient Text Classification}},
author = {Li, Pengfei and Zhong, Peixiang and Mao, Kezhi and Wang, Dongzhe and Yang, Xuefeng and Liu, Yunfeng and Yin, Jianxiong and See, Simon},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2021},
pages = {13261--13269},
doi = {10.1609/AAAI.V35I15.17566},
url = {https://mlanthology.org/aaai/2021/li2021aaai-act/}
}