Interpolating Convex and Non-Convex Tensor Decompositions via the Subspace Norm
Abstract
We consider the problem of recovering a low-rank tensor from its noisy observation. Previous work has shown a recovery guarantee with signal-to-noise ratio $O(n^{\lceil K/2\rceil/2})$ for recovering a $K$th order rank-one tensor of size $n\times \cdots \times n$ by recursive unfolding. In this paper, we first improve this bound to $O(n^{K/4})$ by a much simpler approach, but with a more careful analysis. Then we propose a new norm called the \textit{subspace} norm, which is based on the Kronecker products of factors obtained by the proposed simple estimator. The imposed Kronecker structure allows us to show a nearly ideal $O(\sqrt{n}+\sqrt{H^{K-1}})$ bound, in which the parameter $H$ controls the blend from the non-convex estimator to mode-wise nuclear norm minimization. Furthermore, we empirically demonstrate that the subspace norm achieves the nearly ideal denoising performance even with $H=O(1)$.
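The "simple estimator" mentioned in the abstract can be illustrated with a minimal NumPy sketch (not the authors' code): for a noisy rank-one third-order tensor, the leading left singular vector of each mode-wise unfolding estimates the corresponding factor. The tensor size `n` and the signal strength `snr` below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30

# Ground-truth unit factors of a rank-one tensor T* = snr * u ∘ v ∘ w.
u, v, w = (rng.standard_normal(n) for _ in range(3))
u, v, w = (x / np.linalg.norm(x) for x in (u, v, w))

snr = 100.0  # illustrative signal strength, chosen well above the noise level
T = snr * np.einsum('i,j,k->ijk', u, v, w) + rng.standard_normal((n, n, n))

def mode_unfold(X, mode):
    # Bring `mode` to the front and flatten the remaining modes into columns.
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

# Leading left singular vector of each unfolding estimates each factor
# (up to sign), the idea behind the simple spectral estimator.
u_hat = np.linalg.svd(mode_unfold(T, 0))[0][:, 0]
v_hat = np.linalg.svd(mode_unfold(T, 1))[0][:, 0]
w_hat = np.linalg.svd(mode_unfold(T, 2))[0][:, 0]

print(abs(u @ u_hat), abs(v @ v_hat), abs(w @ w_hat))  # all close to 1
```

In the paper's framework, the estimated factors (or the subspaces they span) are then reused: the subspace norm regularizes each unfolding using Kronecker products of such factors, with $H$ controlling how many are kept.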
Cite
Text

Zheng and Tomioka. "Interpolating Convex and Non-Convex Tensor Decompositions via the Subspace Norm." Neural Information Processing Systems, 2015.

Markdown

[Zheng and Tomioka. "Interpolating Convex and Non-Convex Tensor Decompositions via the Subspace Norm." Neural Information Processing Systems, 2015.](https://mlanthology.org/neurips/2015/zheng2015neurips-interpolating/)

BibTeX
@inproceedings{zheng2015neurips-interpolating,
title = {{Interpolating Convex and Non-Convex Tensor Decompositions via the Subspace Norm}},
author = {Zheng, Qinqing and Tomioka, Ryota},
booktitle = {Neural Information Processing Systems},
year = {2015},
pages = {3106--3113},
url = {https://mlanthology.org/neurips/2015/zheng2015neurips-interpolating/}
}