Sequentially Finding the N-Best List in Hidden Markov Models

Abstract

We propose a novel method to obtain the N-best list of hypotheses in a hidden Markov model (HMM). We show that all the information needed to compute the N-best list from the HMM trellis graph is encapsulated in entities that can be computed in a single forward-backward iteration of the kind usually used to find the most likely state sequence. The hypothesis list can then be extracted sequentially from these entities, without the need to refer back to the original data of the HMM. Furthermore, our approach can yield significant savings in computation time compared to traditional methods.
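To make the problem concrete, the following sketch shows what an N-best list is for a toy HMM by brute-force enumeration of all state sequences. This is only an illustration of the output the paper targets, not the paper's sequential algorithm (which avoids enumeration by reusing quantities from one forward-backward pass); the HMM parameters here are made up for the example.

```python
# Brute-force N-best list for a tiny HMM (illustration only; the
# paper's method extracts the same list far more efficiently).
from itertools import product
import math

# Hypothetical 2-state, 2-symbol HMM parameters (assumed for the demo).
states = [0, 1]
pi = [0.6, 0.4]                      # initial state distribution
A = [[0.7, 0.3], [0.4, 0.6]]         # transition probabilities A[i][j]
B = [[0.9, 0.1], [0.2, 0.8]]         # emission probabilities B[state][symbol]

def sequence_log_prob(seq, obs):
    """Joint log-probability log p(state sequence, observations)."""
    lp = math.log(pi[seq[0]]) + math.log(B[seq[0]][obs[0]])
    for t in range(1, len(obs)):
        lp += math.log(A[seq[t - 1]][seq[t]]) + math.log(B[seq[t]][obs[t]])
    return lp

def n_best(obs, n):
    """Rank every state sequence by joint probability; keep the top n.

    Exponential in len(obs), so it only works on toy problems.
    """
    scored = [(sequence_log_prob(seq, obs), seq)
              for seq in product(states, repeat=len(obs))]
    scored.sort(key=lambda item: -item[0])
    return scored[:n]

obs = [0, 1, 1]
top = n_best(obs, 3)
for lp, seq in top:
    print(seq, round(math.exp(lp), 6))
```

The first entry of the returned list is exactly the Viterbi (most likely) state sequence; the remaining entries are the runner-up hypotheses in decreasing order of probability, which is the list the paper shows how to produce sequentially.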

Cite

Text

Nilsson and Goldberger. "Sequentially Finding the N-Best List in Hidden Markov Models." International Joint Conference on Artificial Intelligence, 2001.

Markdown

[Nilsson and Goldberger. "Sequentially Finding the N-Best List in Hidden Markov Models." International Joint Conference on Artificial Intelligence, 2001.](https://mlanthology.org/ijcai/2001/nilsson2001ijcai-sequentially/)

BibTeX

@inproceedings{nilsson2001ijcai-sequentially,
  title     = {{Sequentially Finding the N-Best List in Hidden Markov Models}},
  author    = {Nilsson, Dennis and Goldberger, Jacob},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2001},
  pages     = {1280--1285},
  url       = {https://mlanthology.org/ijcai/2001/nilsson2001ijcai-sequentially/}
}