End-to-End Quantum-like Language Models with Application to Question Answering
Abstract
Language Modeling (LM) is a fundamental research topic in a range of areas. Recently, inspired by quantum theory, a novel Quantum Language Model (QLM) has been proposed for Information Retrieval (IR). In this paper, we aim to broaden the theoretical and practical basis of QLM. We develop a Neural Network based Quantum-like Language Model (NNQLM) and apply it to Question Answering. Specifically, based on word embeddings, we design a new density matrix, which represents a sentence (e.g., a question or an answer) and encodes a mixture of semantic subspaces. Such a density matrix, together with a joint representation of the question and the answer, can be integrated into neural network architectures (e.g., 2-dimensional convolutional neural networks). Experiments on the TREC-QA and WIKIQA datasets have verified the effectiveness of our proposed models.
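The abstract's core construction — a density matrix over word embeddings that represents a sentence as a mixture of semantic subspaces — can be sketched roughly as follows. This is a minimal illustration in the spirit of quantum language models, not the paper's exact formulation: the function name, the uniform default weighting, and the L2-normalization step are assumptions for the sketch (each normalized embedding is treated as a pure state, and the sentence density matrix is their probability-weighted mixture).

```python
import numpy as np

def sentence_density_matrix(embeddings, weights=None):
    """Sketch: rho = sum_i p_i |w_i><w_i| over a sentence's word embeddings.

    Each embedding is L2-normalized into a unit vector (a pure state);
    the weights p_i form a probability distribution over the words.
    The result is symmetric, positive semi-definite, with unit trace.
    """
    E = np.asarray(embeddings, dtype=float)   # shape (n_words, dim)
    n, d = E.shape
    if weights is None:
        weights = np.full(n, 1.0 / n)         # assumed: uniform mixture
    p = np.asarray(weights, dtype=float)
    p = p / p.sum()                           # ensure a valid distribution
    rho = np.zeros((d, d))
    for p_i, v in zip(p, E):
        v = v / np.linalg.norm(v)             # unit-norm word state
        rho += p_i * np.outer(v, v)           # weighted rank-1 projector
    return rho
```

Density matrices for a question and an answer built this way can then be combined into a joint representation and fed to a 2-D convolutional network, as the abstract describes.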
Cite
Text
Zhang et al. "End-to-End Quantum-like Language Models with Application to Question Answering." AAAI Conference on Artificial Intelligence, 2018. doi:10.1609/AAAI.V32I1.11979
Markdown
[Zhang et al. "End-to-End Quantum-like Language Models with Application to Question Answering." AAAI Conference on Artificial Intelligence, 2018.](https://mlanthology.org/aaai/2018/zhang2018aaai-end-a/) doi:10.1609/AAAI.V32I1.11979
BibTeX
@inproceedings{zhang2018aaai-end-a,
title = {{End-to-End Quantum-like Language Models with Application to Question Answering}},
author = {Zhang, Peng and Niu, Jiabin and Su, Zhan and Wang, Benyou and Ma, Liqun and Song, Dawei},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2018},
pages = {5666--5673},
doi = {10.1609/AAAI.V32I1.11979},
url = {https://mlanthology.org/aaai/2018/zhang2018aaai-end-a/}
}