Spectral Learning of Latent-Variable PCFGs: Algorithms and Sample Complexity
Abstract
We introduce a spectral learning algorithm for latent-variable PCFGs (Matsuzaki et al., 2005; Petrov et al., 2006). Under a separability (singular value) condition, we prove that the method provides statistically consistent parameter estimates. Our result rests on three theorems: the first gives a tensor form of the inside-outside algorithm for PCFGs; the second shows that the required tensors can be estimated directly from training examples where hidden-variable values are missing; the third gives a PAC-style convergence bound for the estimation method.
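For orientation, the following is a rough sketch of what a tensor form of the inside recursion looks like for an L-PCFG with m latent states per nonterminal. The notation is illustrative only: the paper's actual operators are observable representations obtained via an SVD under the separability condition mentioned above, not the latent-state parameters themselves. Here alpha(i,j,a) denotes the inside row vector for nonterminal a spanning words i..j, q^{a->x} is the latent-state vector for a preterminal rule, T^{a->bc} is an m x m x m rule tensor, and pi is the root-state vector.

\[
\alpha(i,i,a) \;=\; q^{\,a \to x_i} \in \mathbb{R}^{1 \times m}
\]
\[
\alpha(i,j,a) \;=\; \sum_{a \to b\,c}\;\sum_{k=i}^{j-1} T^{\,a \to b c}\!\bigl(\alpha(i,k,b),\, \alpha(k{+}1,j,c)\bigr),
\qquad
\bigl[T(y^{1},y^{2})\bigr]_{h} \;=\; \sum_{h_2,h_3} T_{h,h_2,h_3}\, y^{1}_{h_2}\, y^{2}_{h_3}
\]
\[
p(x_1 \cdots x_N) \;=\; \alpha(1,N,S)\,\pi, \qquad \pi \in \mathbb{R}^{m \times 1}
\]

As the abstract notes, the spectral method estimates analogues of these tensors directly from training examples in which the latent-state values are unobserved.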
Cite
Text
Cohen et al. "Spectral Learning of Latent-Variable PCFGs: Algorithms and Sample Complexity." Journal of Machine Learning Research, 2014.
Markdown
[Cohen et al. "Spectral Learning of Latent-Variable PCFGs: Algorithms and Sample Complexity." Journal of Machine Learning Research, 2014.](https://mlanthology.org/jmlr/2014/cohen2014jmlr-spectral/)
BibTeX
@article{cohen2014jmlr-spectral,
  title = {{Spectral Learning of Latent-Variable PCFGs: Algorithms and Sample Complexity}},
  author = {Cohen, Shay B. and Stratos, Karl and Collins, Michael and Foster, Dean P. and Ungar, Lyle},
  journal = {Journal of Machine Learning Research},
  year = {2014},
  pages = {2399--2449},
  volume = {15},
  url = {https://mlanthology.org/jmlr/2014/cohen2014jmlr-spectral/}
}