Modeling Dialogues with Hashcode Representations: A Nonparametric Approach
Abstract
We propose a novel dialogue modeling framework, the first nonparametric, kernel-based approach to dialogue modeling, which learns hashcodes as text representations; unlike traditional deep learning models, it handles relatively small datasets well while also scaling to large ones. We also derive a novel lower bound on mutual information, used as a model-selection criterion that favors representations with better alignment between the utterances of participants in a collaborative dialogue setting, as well as higher predictability of the generated responses. As demonstrated on three real-life datasets, most prominently psychotherapy sessions, the proposed approach significantly outperforms several state-of-the-art neural-network-based dialogue systems, both in computational efficiency, reducing training time from days or weeks to hours, and in response quality, being chosen as the best model by human evaluators an order of magnitude more often than its competitors.
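To make the idea of hashcodes as text representations concrete, here is a minimal sketch of a generic random-hyperplane (SimHash-style) scheme that maps bag-of-words utterance vectors to short binary codes, so that similar utterances tend to agree on more bits. This is only an illustration of binary hashing of text in general; the paper's actual kernelized hashing method, vocabulary, and code length are not reproduced here, and all names below (`bow`, `hashcode`, the toy vocabulary) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary for illustration only.
vocab = {"hello": 0, "how": 1, "are": 2, "you": 3, "doing": 4, "bye": 5}

def bow(text):
    """Bag-of-words vector over the toy vocabulary."""
    v = np.zeros(len(vocab))
    for w in text.lower().split():
        if w in vocab:
            v[vocab[w]] += 1.0
    return v

n_bits = 8
# One random hyperplane per hashcode bit; the bit is the side of the
# hyperplane the utterance vector falls on.
hyperplanes = rng.standard_normal((n_bits, len(vocab)))

def hashcode(text):
    """Map an utterance to an n_bits-long binary hashcode."""
    return tuple((hyperplanes @ bow(text) > 0).astype(int))

a = hashcode("hello how are you")
b = hashcode("how are you doing")
# Hamming distance between codes approximates dissimilarity of utterances.
hamming_ab = sum(x != y for x, y in zip(a, b))
```

Such fixed-length binary codes are cheap to store and compare, which is what makes a nonparametric, retrieval-style dialogue model computationally feasible on both small and large datasets.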
Cite
Text
Garg et al. "Modeling Dialogues with Hashcode Representations: A Nonparametric Approach." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I04.5813
Markdown
[Garg et al. "Modeling Dialogues with Hashcode Representations: A Nonparametric Approach." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/garg2020aaai-modeling/) doi:10.1609/AAAI.V34I04.5813
BibTeX
@inproceedings{garg2020aaai-modeling,
title = {{Modeling Dialogues with Hashcode Representations: A Nonparametric Approach}},
author = {Garg, Sahil and Rish, Irina and Cecchi, Guillermo A. and Goyal, Palash and Ghazarian, Sarik and Gao, Shuyang and Ver Steeg, Greg and Galstyan, Aram},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2020},
pages = {3970--3979},
doi = {10.1609/AAAI.V34I04.5813},
url = {https://mlanthology.org/aaai/2020/garg2020aaai-modeling/}
}