Local Loss Optimization in Operator Models: A New Insight into Spectral Learning
Abstract
This paper revisits the spectral method for learning latent variable models defined in terms of observable operators. We give a new perspective on the method, showing that operators can be recovered by minimizing a loss defined on a finite subset of the domain. This leads to a derivation of a non-convex optimization similar to the spectral method. We also propose a regularized convex relaxation of this optimization. In practice, our experiments show that a continuous regularization parameter (in contrast with the discrete number of states in the original method) allows a better trade-off between accuracy and model complexity. We also prove that, in general, a randomized strategy for choosing the local loss succeeds with high probability.
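To make the abstract concrete, below is a minimal sketch of the classical SVD-based spectral method that the paper revisits, plus a generic singular-value soft-thresholding step illustrating how a continuous regularization parameter can replace the discrete number of states. All names (`spectral_operators`, `svt`, the Hankel blocks `H`, `H_sigmas`, `h_P`, `h_S`) are hypothetical; this is not the paper's exact formulation or its convex relaxation.

```python
# Hedged sketch: standard spectral recovery of observable operators from
# empirical Hankel blocks (assumed already estimated from a sample).
import numpy as np

def spectral_operators(H, H_sigmas, h_P, h_S, n_states):
    """Recover operators via a rank-n truncated SVD of the Hankel matrix H.

    H        : |P| x |S| Hankel matrix of string statistics
    H_sigmas : dict mapping each symbol sigma to its shifted Hankel block
    h_P      : |P| vector (column of H for the empty suffix)
    h_S      : |S| vector (row of H for the empty prefix)
    """
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    U, s, Vt = U[:, :n_states], s[:n_states], Vt[:n_states, :]
    V = Vt.T
    # Pseudo-inverse of the left low-rank factor U @ diag(s), so H ~ (U*s) @ Vt.
    P_pinv = np.linalg.pinv(U * s)
    A = {sig: P_pinv @ Hs @ V for sig, Hs in H_sigmas.items()}
    alpha_inf = P_pinv @ h_P   # termination weights
    alpha_0 = h_S @ V          # initial weights
    return alpha_0, A, alpha_inf

def svt(H, tau):
    """Soft-threshold the singular values of H (the proximal operator of
    the nuclear norm). A continuous tau plays the role that the discrete
    rank cut-off n_states plays above; shown only as a generic analogue
    of continuous regularization, not as the paper's method."""
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt
```

A recovered model scores a string x = x1...xn as f(x) = alpha_0 · A[x1] · ... · A[xn] · alpha_inf, e.g. by folding the operator products left to right over the symbols of x.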
Cite
Text
Balle et al. "Local Loss Optimization in Operator Models: A New Insight into Spectral Learning." International Conference on Machine Learning, 2012.

Markdown
[Balle et al. "Local Loss Optimization in Operator Models: A New Insight into Spectral Learning." International Conference on Machine Learning, 2012.](https://mlanthology.org/icml/2012/balle2012icml-local/)

BibTeX
@inproceedings{balle2012icml-local,
title = {{Local Loss Optimization in Operator Models: A New Insight into Spectral Learning}},
author = {Balle, Borja and Quattoni, Ariadna and Carreras, Xavier},
booktitle = {International Conference on Machine Learning},
year = {2012},
url = {https://mlanthology.org/icml/2012/balle2012icml-local/}
}