An Equal-Size Hard EM Algorithm for Diverse Dialogue Generation

Abstract

Open-domain dialogue systems aim to interact with humans through natural language text in an open-ended fashion. Despite the recent success of very large dialogue systems such as ChatGPT, medium-to-small-sized dialogue systems remain the common practice as they are more lightweight and accessible; however, generating diverse dialogue responses is challenging, especially with smaller models. In this work, we propose an Equal-size Hard Expectation--Maximization (EqHard-EM) algorithm to train a multi-decoder model for diverse dialogue generation. Our algorithm assigns each sample to a decoder in a hard manner and additionally imposes an equal-assignment constraint to ensure that all decoders are well trained. We provide a detailed theoretical analysis to justify our approach. Further, experiments on two large-scale open-domain dialogue datasets verify that our EqHard-EM algorithm generates high-quality, diverse responses.
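The abstract's core idea, a hard E-step where each sample is assigned to exactly one decoder while keeping all decoders' assignment counts equal, can be sketched as follows. This is my own illustration of one way to implement such a balanced hard assignment (a greedy pass over a per-sample, per-decoder loss matrix), not the authors' released code; the function name and the divisibility assumption are mine.

```python
import numpy as np

def equal_size_hard_assign(losses):
    """Greedy equal-size hard assignment (illustrative sketch).

    losses: (n_samples, n_decoders) array, e.g. per-decoder negative
    log-likelihoods of each sample. Returns an array of decoder indices
    such that each decoder receives exactly n_samples // n_decoders
    samples (assumes the batch size is divisible by the decoder count).
    """
    n, k = losses.shape
    capacity = n // k  # equal-assignment constraint: samples per decoder
    # Visit (sample, decoder) pairs in order of increasing loss and
    # greedily assign, skipping decoders that are already full.
    order = np.argsort(losses, axis=None)
    counts = np.zeros(k, dtype=int)
    assignment = -np.ones(n, dtype=int)
    for flat in order:
        i, j = divmod(int(flat), k)
        if assignment[i] == -1 and counts[j] < capacity:
            assignment[i] = j
            counts[j] += 1
    return assignment

# Usage: two decoders, four samples; each decoder ends up with two samples.
losses = np.array([[0.1, 0.9],
                   [0.2, 0.8],
                   [0.9, 0.1],
                   [0.8, 0.2]])
print(equal_size_hard_assign(losses))  # → [0 0 1 1]
```

A greedy pass like this is only a heuristic for the underlying balanced-assignment problem; an exact solution could instead use a min-cost matching, but the greedy version suffices to show how the equal-size constraint keeps every decoder supplied with training samples.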

Cite

Text

Wen et al. "An Equal-Size Hard EM Algorithm for Diverse Dialogue Generation." International Conference on Learning Representations, 2023.

Markdown

[Wen et al. "An Equal-Size Hard EM Algorithm for Diverse Dialogue Generation." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/wen2023iclr-equalsize/)

BibTeX

@inproceedings{wen2023iclr-equalsize,
  title     = {{An Equal-Size Hard EM Algorithm for Diverse Dialogue Generation}},
  author    = {Wen, Yuqiao and Hao, Yongchang and Cao, Yanshuai and Mou, Lili},
  booktitle = {International Conference on Learning Representations},
  year      = {2023},
  url       = {https://mlanthology.org/iclr/2023/wen2023iclr-equalsize/}
}