Empirical Risk Minimization with Approximations of Probabilistic Grammars
Abstract
Probabilistic grammars are generative statistical models that are useful for modeling compositional and sequential structures. We present a framework, reminiscent of structural risk minimization, for empirical risk minimization of the parameters of a fixed probabilistic grammar under the log-loss. We derive sample complexity bounds in this framework that apply to both the supervised and unsupervised settings.
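A minimal sketch of the setup the abstract describes, with notation assumed here rather than taken from the paper: for a fixed probabilistic grammar with parameter vector $\theta \in \Theta$ and an i.i.d. training sample of derivations $z_1, \ldots, z_n$, the empirical risk minimizer under the log-loss is

$$\hat{\theta} = \operatorname*{argmin}_{\theta \in \Theta} \; \frac{1}{n} \sum_{i=1}^{n} -\log p_\theta(z_i).$$

In the supervised setting each $z_i$ is a fully observed derivation; in the unsupervised setting only the yield $x_i$ (the surface string) is observed, so the per-example loss marginalizes over latent derivations, $-\log \sum_{z \,:\, \mathrm{yield}(z) = x_i} p_\theta(z)$.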
Cite
Text
Smith and Cohen. "Empirical Risk Minimization with Approximations of Probabilistic Grammars." Neural Information Processing Systems, 2010.
Markdown
[Smith and Cohen. "Empirical Risk Minimization with Approximations of Probabilistic Grammars." Neural Information Processing Systems, 2010.](https://mlanthology.org/neurips/2010/smith2010neurips-empirical/)
BibTeX
@inproceedings{smith2010neurips-empirical,
  title = {{Empirical Risk Minimization with Approximations of Probabilistic Grammars}},
  author = {Smith, Noah A. and Cohen, Shay B.},
  booktitle = {Neural Information Processing Systems},
  year = {2010},
  pages = {424--432},
  url = {https://mlanthology.org/neurips/2010/smith2010neurips-empirical/}
}