Open Problem: Tightness of Maximum Likelihood Semidefinite Relaxations
Abstract
We have observed an interesting, yet unexplained, phenomenon: semidefinite programming (SDP) based relaxations of maximum likelihood estimators (MLE) tend to be tight in recovery problems with noisy data, even when the MLE cannot exactly recover the ground truth. Several results establish the tightness of SDP-based relaxations in the regime where exact recovery via the MLE is possible. However, to the best of our knowledge, their tightness is not understood beyond this regime. As an illustrative example, we focus on the generalized Procrustes problem.
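To make the open problem concrete, the following is a minimal sketch (not taken from the paper) of the standard SDP relaxation of the generalized orthogonal Procrustes problem mentioned in the abstract, written with `cvxpy`. The problem sizes, the noise level `sigma`, and the Gaussian noise model are assumptions chosen only for illustration. The relaxation optimizes over the block Gram matrix G = [O_i O_j^T] after dropping the rank-d constraint; tightness is the event that the optimal G nevertheless has rank d.

```python
import numpy as np
import cvxpy as cp


def random_orthogonal(d, rng):
    """Random orthogonal matrix via QR of a Gaussian matrix."""
    q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    return q


# Hypothetical problem sizes and noise level, chosen only for illustration.
d, k, n, sigma = 3, 10, 5, 0.1
rng = np.random.default_rng(0)

A = rng.standard_normal((d, k))                    # ground-truth point cloud
R = [random_orthogonal(d, rng) for _ in range(n)]  # unknown rotations
A_obs = [R[i] @ A + sigma * rng.standard_normal((d, k)) for i in range(n)]

# Cost matrix C with (i, j) block A_i A_j^T; the MLE maximizes tr(C G) over
# G = [O_i O_j^T] with each O_i orthogonal, which forces G >= 0, G_ii = I_d
# and rank(G) = d.  The SDP relaxation simply drops the rank constraint.
C = np.block([[A_obs[i] @ A_obs[j].T for j in range(n)] for i in range(n)])

G = cp.Variable((n * d, n * d), symmetric=True)
constraints = [G >> 0]
constraints += [G[i * d:(i + 1) * d, i * d:(i + 1) * d] == np.eye(d)
                for i in range(n)]
cp.Problem(cp.Maximize(cp.trace(C @ G)), constraints).solve()

# The relaxation is "tight" when the optimal G has rank d (the O_i can then be
# read off from its top-d eigenvectors); inspect the spectrum to check.
eigvals = np.linalg.eigvalsh(G.value)
print("largest eigenvalues of G:", np.round(eigvals[::-1][:d + 2], 3))
```

The phenomenon raised in the open problem corresponds to noise levels at which the rotations recovered this way differ from the ground truth, yet the printed spectrum still shows only d non-negligible eigenvalues.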
Cite
Text
Bandeira et al. "Open Problem: Tightness of Maximum Likelihood Semidefinite Relaxations." Annual Conference on Computational Learning Theory, 2014.

Markdown

[Bandeira et al. "Open Problem: Tightness of Maximum Likelihood Semidefinite Relaxations." Annual Conference on Computational Learning Theory, 2014.](https://mlanthology.org/colt/2014/bandeira2014colt-open/)

BibTeX
@inproceedings{bandeira2014colt-open,
  title     = {{Open Problem: Tightness of Maximum Likelihood Semidefinite Relaxations}},
  author    = {Bandeira, Afonso S. and Khoo, Yuehaw and Singer, Amit},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2014},
  pages     = {1265--1267},
  url       = {https://mlanthology.org/colt/2014/bandeira2014colt-open/}
}