Recurrent Convolutional Neural Networks for Text Classification

Abstract

Text classification is a foundational task in many NLP applications. Traditional text classifiers often rely on many human-designed features, such as dictionaries, knowledge bases, and special tree kernels. In contrast to traditional methods, we introduce a recurrent convolutional neural network for text classification without human-designed features. In our model, we apply a recurrent structure to capture contextual information as far as possible when learning word representations, which may introduce considerably less noise than traditional window-based neural networks. We also employ a max-pooling layer that automatically judges which words play key roles in text classification, capturing the key components of a text. We conduct experiments on four commonly used datasets. The experimental results show that the proposed method outperforms the state-of-the-art methods on several datasets, particularly on document-level datasets.
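The architecture described in the abstract — recurrently computed left and right contexts concatenated with each word embedding, followed by element-wise max-pooling over all positions — can be sketched in NumPy. This is a minimal illustrative sketch, not the authors' code: all dimensions, weight names, and the use of untrained random weights are assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
emb_dim, ctx_dim, hid_dim, n_classes = 8, 6, 10, 2  # illustrative sizes
seq_len = 5

E = rng.normal(size=(seq_len, emb_dim))          # word embeddings e(w_i)
W_l = rng.normal(size=(ctx_dim, ctx_dim)) * 0.1  # left-context recurrence
W_sl = rng.normal(size=(ctx_dim, emb_dim)) * 0.1
W_r = rng.normal(size=(ctx_dim, ctx_dim)) * 0.1  # right-context recurrence
W_sr = rng.normal(size=(ctx_dim, emb_dim)) * 0.1
W_2 = rng.normal(size=(hid_dim, ctx_dim + emb_dim + ctx_dim)) * 0.1
W_4 = rng.normal(size=(n_classes, hid_dim)) * 0.1

# Left context scans left-to-right; right context scans right-to-left,
# so every word sees (in principle) the whole surrounding text.
c_l = np.zeros((seq_len, ctx_dim))
c_r = np.zeros((seq_len, ctx_dim))
for i in range(1, seq_len):
    c_l[i] = np.tanh(W_l @ c_l[i - 1] + W_sl @ E[i - 1])
for i in range(seq_len - 2, -1, -1):
    c_r[i] = np.tanh(W_r @ c_r[i + 1] + W_sr @ E[i + 1])

# Word representation x_i = [c_l(w_i); e(w_i); c_r(w_i)], projected to y_i.
X = np.concatenate([c_l, E, c_r], axis=1)
Y = np.tanh(X @ W_2.T)

# Element-wise max-pooling over positions keeps, per dimension, the word
# that responds most strongly -- the "key component" selection step.
y_max = Y.max(axis=0)
logits = W_4 @ y_max
probs = np.exp(logits) / np.exp(logits).sum()  # softmax over classes
print(y_max.shape, probs)
```

With trained weights, `probs` would give the class distribution for the text; here the forward pass only illustrates how the fixed-size text vector `y_max` is obtained regardless of sequence length.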

Cite

Text

Lai et al. "Recurrent Convolutional Neural Networks for Text Classification." AAAI Conference on Artificial Intelligence, 2015. doi:10.1609/AAAI.V29I1.9513

Markdown

[Lai et al. "Recurrent Convolutional Neural Networks for Text Classification." AAAI Conference on Artificial Intelligence, 2015.](https://mlanthology.org/aaai/2015/lai2015aaai-recurrent/) doi:10.1609/AAAI.V29I1.9513

BibTeX

@inproceedings{lai2015aaai-recurrent,
  title     = {{Recurrent Convolutional Neural Networks for Text Classification}},
  author    = {Lai, Siwei and Xu, Liheng and Liu, Kang and Zhao, Jun},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2015},
  pages     = {2267--2273},
  doi       = {10.1609/AAAI.V29I1.9513},
  url       = {https://mlanthology.org/aaai/2015/lai2015aaai-recurrent/}
}