Efficient Estimation of OOMs
Abstract
A standard method to obtain stochastic models for symbolic time series is to train state-emitting hidden Markov models (SE-HMMs) with the Baum-Welch algorithm. Based on observable operator models (OOMs), in the last few months a number of novel learning algorithms for similar purposes have been developed: (1,2) two versions of an "efficiency sharpening" (ES) algorithm, which iteratively improves the statistical efficiency of a sequence of OOM estimators, and (3) a constrained gradient descent ML estimator for transition-emitting HMMs (TE-HMMs). We give an overview of these algorithms and compare them with SE-HMM/EM learning on synthetic and real-life data.
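The SE-HMM/EM baseline mentioned in the abstract is the classical Baum-Welch procedure. As a point of reference, here is a minimal, self-contained sketch of Baum-Welch for a discrete state-emitting HMM; the function name, the toy symbol sequence, and all parameter choices are illustrative, not taken from the paper.

```python
import numpy as np

def baum_welch(obs, n_states, n_symbols, n_iter=50, seed=0):
    """Illustrative Baum-Welch (EM) for a discrete state-emitting HMM."""
    rng = np.random.default_rng(seed)
    # Random row-stochastic initial parameters.
    A = rng.random((n_states, n_states)); A /= A.sum(1, keepdims=True)   # transitions
    B = rng.random((n_states, n_symbols)); B /= B.sum(1, keepdims=True)  # emissions
    pi = np.full(n_states, 1.0 / n_states)                               # initial distribution
    T = len(obs)
    for _ in range(n_iter):
        # E-step: scaled forward pass.
        alpha = np.zeros((T, n_states)); c = np.zeros(T)
        alpha[0] = pi * B[:, obs[0]]
        c[0] = alpha[0].sum(); alpha[0] /= c[0]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
            c[t] = alpha[t].sum(); alpha[t] /= c[t]
        # Scaled backward pass.
        beta = np.zeros((T, n_states)); beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
        # State and transition posteriors.
        gamma = alpha * beta
        gamma /= gamma.sum(1, keepdims=True)
        xi = np.zeros((n_states, n_states))
        for t in range(T - 1):
            xi += (alpha[t][:, None] * A
                   * (B[:, obs[t + 1]] * beta[t + 1])[None, :]) / c[t + 1]
        # M-step: re-estimate parameters from expected counts.
        pi = gamma[0]
        A = xi / gamma[:-1].sum(0)[:, None]
        for k in range(n_symbols):
            B[:, k] = gamma[obs == k].sum(0)
        B /= gamma.sum(0)[:, None]
    log_lik = np.log(c).sum()  # scaled log-likelihood of the training sequence
    return A, B, pi, log_lik

# Toy binary symbol sequence (made up for illustration).
obs = np.array([0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0])
A, B, pi, ll = baum_welch(obs, n_states=2, n_symbols=2)
print(ll)
```

EM guarantees the log-likelihood is non-decreasing over iterations; the novel OOM/TE-HMM estimators in the paper are compared against exactly this kind of baseline.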
Cite
Text
Jaeger et al. "Efficient Estimation of OOMs." Neural Information Processing Systems, 2005.
Markdown
[Jaeger et al. "Efficient Estimation of OOMs." Neural Information Processing Systems, 2005.](https://mlanthology.org/neurips/2005/jaeger2005neurips-efficient/)
BibTeX
@inproceedings{jaeger2005neurips-efficient,
title = {{Efficient Estimation of OOMs}},
author = {Jaeger, Herbert and Zhao, Mingjie and Kolling, Andreas},
booktitle = {Neural Information Processing Systems},
year = {2005},
pages = {555--562},
url = {https://mlanthology.org/neurips/2005/jaeger2005neurips-efficient/}
}