Learning Concept Embeddings for Query Expansion by Quantum Entropy Minimization
Abstract
In web search, user queries are formulated using only a few terms, and term-matching retrieval functions may fail to retrieve relevant documents. Given a user query, query expansion (QE) consists of selecting related terms that can increase the likelihood of retrieving relevant documents. Selecting such expansion terms is challenging and requires a computational framework capable of encoding complex semantic relationships. In this paper, we propose a novel method for learning, in a supervised way, semantic representations for words and phrases. By embedding queries and documents in special matrices, our model has greater representational power than existing approaches that adopt a vector representation. We show that our model produces high-quality query expansion terms: our expansions improve IR measures beyond expansions from current word-embedding models and well-established traditional QE methods.
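The abstract states that queries and documents are embedded in matrices rather than vectors. As a rough illustration of that idea, the sketch below builds a unit-trace, positive semidefinite (density-like) matrix from word embeddings and scores candidate expansion terms by the quadratic form v^T ρ v. This is only a minimal sketch under my own assumptions: the function names, the mixture-of-projectors construction, and the scoring rule are hypothetical and do not reproduce the paper's supervised quantum entropy minimization objective.

```python
import numpy as np

def density_matrix(word_vectors, weights=None):
    """Build a unit-trace, positive semidefinite matrix from word embeddings.

    Each word contributes a rank-one projector v v^T (after normalization);
    their weighted mixture is a valid density-like matrix. This construction
    is illustrative, not the paper's exact model.
    """
    d = word_vectors.shape[1]
    if weights is None:
        weights = np.full(len(word_vectors), 1.0 / len(word_vectors))
    rho = np.zeros((d, d))
    for w, v in zip(weights, word_vectors):
        v = v / np.linalg.norm(v)
        rho += w * np.outer(v, v)
    return rho / np.trace(rho)

def expansion_score(rho_query, candidate_vector):
    """Score a candidate expansion term by the expectation v^T rho v."""
    v = candidate_vector / np.linalg.norm(candidate_vector)
    return float(v @ rho_query @ v)

# Toy usage with random embeddings standing in for learned ones.
rng = np.random.default_rng(0)
query_vecs = rng.normal(size=(3, 50))    # embeddings of the query terms
candidates = rng.normal(size=(100, 50))  # embeddings of candidate expansion terms
rho = density_matrix(query_vecs)
scores = [expansion_score(rho, c) for c in candidates]
top_10 = np.argsort(scores)[::-1][:10]   # indices of the 10 best-scoring terms
```

Because the query is a matrix, a candidate term is scored against the full mixture of query senses rather than against a single averaged vector, which is the extra representational power the abstract alludes to.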
Cite
Text
Sordoni et al. "Learning Concept Embeddings for Query Expansion by Quantum Entropy Minimization." AAAI Conference on Artificial Intelligence, 2014. doi:10.1609/AAAI.V28I1.8933
Markdown
[Sordoni et al. "Learning Concept Embeddings for Query Expansion by Quantum Entropy Minimization." AAAI Conference on Artificial Intelligence, 2014.](https://mlanthology.org/aaai/2014/sordoni2014aaai-learning/) doi:10.1609/AAAI.V28I1.8933
BibTeX
@inproceedings{sordoni2014aaai-learning,
title = {{Learning Concept Embeddings for Query Expansion by Quantum Entropy Minimization}},
author = {Sordoni, Alessandro and Bengio, Yoshua and Nie, Jian-Yun},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2014},
  pages = {1586--1592},
doi = {10.1609/AAAI.V28I1.8933},
url = {https://mlanthology.org/aaai/2014/sordoni2014aaai-learning/}
}