Incorporating Context-Relevant Knowledge into Convolutional Neural Networks for Short Text Classification
Abstract
Many text classification methods perform poorly on short texts because of data sparsity, and they do not fully exploit context-relevant knowledge. To tackle these problems, we propose a neural network that incorporates context-relevant knowledge into a convolutional neural network for short text classification. Our model consists of two modules. The first module uses two layers to extract concept and context features, respectively, and then employs an attention layer to select the context-relevant concepts. The second module uses a convolutional neural network to extract high-level features from the word and context-relevant concept features. Experimental results on three datasets show that our proposed model outperforms state-of-the-art models.
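The abstract outlines the two-module design: an attention layer weights external concepts by their relevance to the short-text context, and a CNN then classifies the combined word and concept features. The PyTorch sketch below illustrates one plausible reading of that pipeline; it is not the authors' code, and the class name `ContextRelevantConceptCNN`, the mean-pooled context vector, the linear attention scorer, and all hyperparameters are assumptions for illustration only.

```python
# Minimal sketch (assumed, not the authors' implementation): attention selects
# context-relevant concepts, then a CNN classifies word + concept features.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContextRelevantConceptCNN(nn.Module):
    def __init__(self, vocab_size, concept_size, embed_dim=300,
                 num_filters=100, kernel_sizes=(3, 4, 5), num_classes=2):
        super().__init__()
        self.word_embed = nn.Embedding(vocab_size, embed_dim)
        self.concept_embed = nn.Embedding(concept_size, embed_dim)
        # Attention layer: scores each concept against the short-text context.
        self.attn = nn.Linear(2 * embed_dim, 1)
        # Convolutional module over the word + concept feature sequence.
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes])
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, words, concepts):
        w = self.word_embed(words)                    # (B, Lw, D) word features
        c = self.concept_embed(concepts)              # (B, Lc, D) concept features
        # Assumed context representation: mean of the word embeddings.
        context = w.mean(dim=1, keepdim=True)         # (B, 1, D)
        # Score each concept by its relevance to the context.
        scores = self.attn(torch.cat([c, context.expand_as(c)], dim=-1))
        alpha = torch.softmax(scores, dim=1)          # (B, Lc, 1)
        c_weighted = alpha * c                        # context-relevant concepts
        # Concatenate word and weighted concept features along the sequence axis.
        x = torch.cat([w, c_weighted], dim=1).transpose(1, 2)  # (B, D, Lw+Lc)
        feats = []
        for conv in self.convs:
            h = F.relu(conv(x))                       # (B, F, L')
            feats.append(h.max(dim=2).values)         # global max pooling
        return self.fc(torch.cat(feats, dim=1))       # class logits
```

Under these assumptions, a forward pass takes a batch of word-index and concept-index tensors and returns class logits; the exact concept source (e.g., a knowledge base) and feature layers are described in the full paper.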
Cite
Text
Xu and Cai. "Incorporating Context-Relevant Knowledge into Convolutional Neural Networks for Short Text Classification." AAAI Conference on Artificial Intelligence, 2019. doi:10.1609/AAAI.V33I01.330110067
Markdown
[Xu and Cai. "Incorporating Context-Relevant Knowledge into Convolutional Neural Networks for Short Text Classification." AAAI Conference on Artificial Intelligence, 2019.](https://mlanthology.org/aaai/2019/xu2019aaai-incorporating/) doi:10.1609/AAAI.V33I01.330110067
BibTeX
@inproceedings{xu2019aaai-incorporating,
title = {{Incorporating Context-Relevant Knowledge into Convolutional Neural Networks for Short Text Classification}},
author = {Xu, Jingyun and Cai, Yi},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2019},
pages = {10067-10068},
doi = {10.1609/AAAI.V33I01.330110067},
url = {https://mlanthology.org/aaai/2019/xu2019aaai-incorporating/}
}