Leveraging Title-Abstract Attentive Semantics for Paper Recommendation
Abstract
Paper recommendation aims to provide users with personalized papers of interest. However, most existing approaches treat the title and abstract equally as input when learning the representation of a paper, ignoring their semantic relationship. In this paper, we regard the abstract as a sequence of sentences and propose a two-level attentive neural network to capture: (1) for each word within a sentence, how semantically close it is to the words in the title; and (2) the relevance of each sentence in the abstract to the title, which is often a good summary of the abstract. Specifically, we propose a Long Short-Term Memory (LSTM) network with attention to learn sentence representations, and integrate a Gated Recurrent Unit (GRU) network with a memory network to learn the long-term sequential sentence patterns of interacted papers for both user and item (paper) modeling. Extensive experiments on two real datasets show that our approach outperforms other state-of-the-art approaches in terms of accuracy.
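The two-level attention described above can be illustrated with a minimal sketch: words are weighted by their similarity to the title to form sentence vectors, and sentences are weighted the same way to form a paper vector. This is a simplified stand-in using mean-pooled title vectors and dot-product scores; the paper itself uses LSTM hidden states and learned attention parameters, which are omitted here.

```python
import math

def softmax(scores):
    # numerically stable softmax over a list of scores
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def mean_vec(vecs):
    d = len(vecs[0])
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(d)]

def weighted_sum(weights, vecs):
    d = len(vecs[0])
    return [sum(w * v[i] for w, v in zip(weights, vecs)) for i in range(d)]

def sentence_repr(sentence, title_vec):
    # word-level attention: weight each word embedding by its
    # similarity to the title vector, then pool
    alpha = softmax([dot(w, title_vec) for w in sentence])
    return weighted_sum(alpha, sentence)

def paper_repr(abstract_sentences, title_words):
    # sentence-level attention: score each attended sentence
    # representation against the title and pool again
    title_vec = mean_vec(title_words)
    sents = [sentence_repr(s, title_vec) for s in abstract_sentences]
    alpha = softmax([dot(s, title_vec) for s in sents])
    return weighted_sum(alpha, sents)

# toy example: 2-d embeddings, a 2-word title, a 2-sentence abstract
title = [[1.0, 0.0], [0.0, 1.0]]
abstract = [[[1.0, 1.0], [0.0, 0.0]],
            [[2.0, 0.0], [0.0, 2.0]]]
print(paper_repr(abstract, title))
```

Real implementations would replace the raw embeddings with LSTM hidden states and the dot-product scores with a learned bilinear or MLP attention, but the weight-then-pool structure at both levels is the same.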
Cite
Text
Guo et al. "Leveraging Title-Abstract Attentive Semantics for Paper Recommendation." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I01.5335
Markdown
[Guo et al. "Leveraging Title-Abstract Attentive Semantics for Paper Recommendation." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/guo2020aaai-leveraging/) doi:10.1609/AAAI.V34I01.5335
BibTeX
@inproceedings{guo2020aaai-leveraging,
title = {{Leveraging Title-Abstract Attentive Semantics for Paper Recommendation}},
author = {Guo, Guibing and Chen, Bowei and Zhang, Xiaoyan and Liu, Zhirong and Dong, Zhenhua and He, Xiuqiang},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2020},
  pages = {67--74},
doi = {10.1609/AAAI.V34I01.5335},
url = {https://mlanthology.org/aaai/2020/guo2020aaai-leveraging/}
}