Relative Density Nets: A New Way to Combine Backpropagation with HMM's
Abstract
Logistic units in the first hidden layer of a feedforward neural network compute the relative probability of a data point under two Gaussians. This leads us to consider substituting other density models. We present an architecture for performing discriminative learning of Hidden Markov Models using a network of many small HMM's. Experiments on speech data show it to be superior to the standard method of discriminatively training HMM's.
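The abstract's opening claim is a standard identity rather than something introduced by the paper: for two Gaussians with a shared covariance, the posterior probability of one component is a logistic function of a linear combination of the input, so a logistic unit can be read as computing a relative density. A minimal sketch of that derivation (the symbols $\pi_1, \pi_2, \boldsymbol{\mu}_1, \boldsymbol{\mu}_2, \Sigma$ are illustrative, not the paper's notation):

$$
\sigma\!\left(\mathbf{w}^{\top}\mathbf{x} + b\right)
  = \frac{\pi_1\,\mathcal{N}(\mathbf{x};\boldsymbol{\mu}_1,\Sigma)}
         {\pi_1\,\mathcal{N}(\mathbf{x};\boldsymbol{\mu}_1,\Sigma)
        + \pi_2\,\mathcal{N}(\mathbf{x};\boldsymbol{\mu}_2,\Sigma)},
\quad\text{where}\quad
\mathbf{w} = \Sigma^{-1}(\boldsymbol{\mu}_1 - \boldsymbol{\mu}_2),
\qquad
b = \tfrac{1}{2}\boldsymbol{\mu}_2^{\top}\Sigma^{-1}\boldsymbol{\mu}_2
  - \tfrac{1}{2}\boldsymbol{\mu}_1^{\top}\Sigma^{-1}\boldsymbol{\mu}_1
  + \log\frac{\pi_1}{\pi_2}.
$$

Substituting arbitrary density models $p_1(\mathbf{x})$ and $p_2(\mathbf{x})$ for the two Gaussians turns the unit into $\sigma(\log p_1(\mathbf{x}) - \log p_2(\mathbf{x}) + b)$, which is the substitution the abstract proposes, with small HMM's playing the role of the density models.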
Cite
Text
Brown and Hinton. "Relative Density Nets: A New Way to Combine Backpropagation with HMM's." Neural Information Processing Systems, 2001.Markdown
[Brown and Hinton. "Relative Density Nets: A New Way to Combine Backpropagation with HMM's." Neural Information Processing Systems, 2001.](https://mlanthology.org/neurips/2001/brown2001neurips-relative/)BibTeX
@inproceedings{brown2001neurips-relative,
title = {{Relative Density Nets: A New Way to Combine Backpropagation with HMM's}},
author = {Brown, Andrew D. and Hinton, Geoffrey E.},
booktitle = {Neural Information Processing Systems},
year = {2001},
pages = {1149--1156},
url = {https://mlanthology.org/neurips/2001/brown2001neurips-relative/}
}