Stable Spectral Learning Based on Schur Decomposition
Abstract
Spectral methods are a powerful tool for inferring the parameters of certain classes of probability distributions by means of standard eigenvalue-eigenvector decompositions. Spectral algorithms can be orders of magnitude faster than log-likelihood-based and related iterative methods, and, thanks to the uniqueness of the spectral decomposition, they enjoy global optimality guarantees. In practice, however, the applicability of spectral methods is limited by their sensitivity to model misspecification, which can cause instability in the case of non-exact models. We present a new spectral approach based on the Schur triangularization of a family of nearly-commuting matrices, and we carry out the corresponding theoretical analysis. Our main result is a theoretical bound on the estimation error, which is shown to depend directly on the model misspecification error and inversely on an eigenvalue separation gap. Numerical experiments show that the proposed method is more stable, and generally performs better, than the classical spectral approach based on direct matrix diagonalization.
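The core idea summarized above — recovering eigenvalues from a family of nearly-commuting matrices via a single Schur triangularization rather than direct diagonalization — can be sketched with off-the-shelf tools. This is a minimal illustration under synthetic assumptions (shared random eigenvectors plus a small perturbation standing in for model misspecification), not the paper's actual algorithm:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
k = 4

# Synthetic nearly-commuting family: simultaneously diagonalizable
# matrices (shared eigenvectors V) plus a small perturbation that
# plays the role of model misspecification.
V = rng.standard_normal((k, k))
Vinv = np.linalg.inv(V)
mats = [V @ np.diag(rng.uniform(1.0, 2.0, k)) @ Vinv for _ in range(3)]
eps = 1e-6
mats = [M + eps * rng.standard_normal((k, k)) for M in mats]

# Schur-triangularize one random linear combination of the family.
# Because the matrices nearly commute, the same orthogonal Q
# approximately triangularizes every matrix, and the eigenvalues can
# be read off the (near-)triangular diagonals in a consistent order.
C = sum(rng.standard_normal() * M for M in mats)
T, Q = schur(C, output='real')
eigenvalues = [np.diag(Q.T @ M @ Q) for M in mats]
```

The stability payoff hinted at in the abstract comes from Q being orthogonal: unlike the (possibly ill-conditioned) eigenvector matrix used in direct diagonalization, applying Q cannot amplify the misspecification error.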
Cite
Text
Colombo and Vlassis. "Stable Spectral Learning Based on Schur Decomposition." Conference on Uncertainty in Artificial Intelligence, 2015.

Markdown
[Colombo and Vlassis. "Stable Spectral Learning Based on Schur Decomposition." Conference on Uncertainty in Artificial Intelligence, 2015.](https://mlanthology.org/uai/2015/colombo2015uai-stable/)

BibTeX
@inproceedings{colombo2015uai-stable,
title = {{Stable Spectral Learning Based on Schur Decomposition}},
author = {Colombo, Nicolò and Vlassis, Nikos},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {2015},
pages = {220--227},
url = {https://mlanthology.org/uai/2015/colombo2015uai-stable/}
}