Controlling the Complexity of HMM Systems by Regularization
Abstract
This paper introduces a method for regularization of HMM systems that avoids parameter overfitting caused by insufficient training data. Regularization is done by augmenting the EM training method with a penalty term that favors simple and smooth HMM systems. The penalty term is constructed as a mixture model of negative exponential distributions that is assumed to generate the state-dependent emission probabilities of the HMMs. This new method is the successful transfer of a well-known regularization approach in neural networks to the HMM domain and can be interpreted as a generalization of traditional state-tying for HMM systems. The effect of regularization is demonstrated for continuous speech recognition tasks by improving overfitted triphone models and by speaker adaptation with limited training data.
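The penalized-EM idea described above can be sketched in a few lines. Note this is a toy illustration only: the paper's actual penalty is a mixture of negative exponential distributions over the emission probabilities, whereas this sketch substitutes a simpler Dirichlet-style pseudo-count penalty that pulls re-estimated emissions toward a smooth target distribution. The function name, the `lam` strength parameter, and the uniform prior are assumptions, not from the paper.

```python
import numpy as np

def penalized_m_step(expected_counts, prior, lam):
    """Penalized M-step for discrete HMM emission probabilities.

    Illustrative stand-in for the paper's penalty: `lam * prior` acts as
    smoothing pseudo-counts added to the E-step's expected emission counts,
    so sparsely observed states are pulled toward the smooth prior instead
    of overfitting. lam = 0 recovers the plain maximum-likelihood update.
    """
    counts = expected_counts + lam * prior             # add pseudo-counts
    return counts / counts.sum(axis=1, keepdims=True)  # renormalize per state

# Expected emission counts from the E-step (2 states, 3 symbols);
# state 1 has seen almost no data, the classic overfitting case.
gamma = np.array([[8.0, 1.0, 1.0],
                  [0.0, 1.0, 0.0]])
prior = np.full(3, 1.0 / 3.0)  # smooth (uniform) target distribution

b_ml  = penalized_m_step(gamma, prior, 0.0)  # ML: state 1 puts all mass on one symbol
b_map = penalized_m_step(gamma, prior, 3.0)  # penalized: mass spread toward the prior
```

With `lam = 3.0`, state 1's distribution becomes `[0.25, 0.5, 0.25]` instead of the degenerate `[0, 1, 0]`, which is the qualitative effect the paper's regularizer aims for on under-trained triphone states.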
Cite
Text
Neukirchen and Rigoll. "Controlling the Complexity of HMM Systems by Regularization." Neural Information Processing Systems, 1998.
Markdown
[Neukirchen and Rigoll. "Controlling the Complexity of HMM Systems by Regularization." Neural Information Processing Systems, 1998.](https://mlanthology.org/neurips/1998/neukirchen1998neurips-controlling/)
BibTeX
@inproceedings{neukirchen1998neurips-controlling,
title = {{Controlling the Complexity of HMM Systems by Regularization}},
author = {Neukirchen, Christoph and Rigoll, Gerhard},
booktitle = {Neural Information Processing Systems},
year = {1998},
pages = {737-743},
url = {https://mlanthology.org/neurips/1998/neukirchen1998neurips-controlling/}
}