Optimal Dictionary for Least Squares Representation
Abstract
Dictionaries are collections of vectors used to represent a class of vectors in Euclidean spaces. Recent research on optimal dictionaries has focused on constructing dictionaries that offer sparse representations, i.e., $\ell_0$-optimal representations. Here we consider the problem of finding dictionaries with which representations of a given class of vectors are optimal in an $\ell_2$-sense: optimality of representation is defined as attaining the minimal average $\ell_2$-norm of the coefficients used to represent the vectors in the given class. With the help of recent results on rank-1 decompositions of symmetric positive semidefinite matrices, we provide an explicit description of $\ell_2$-optimal dictionaries as well as algorithms for constructing them in polynomial time.
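As a rough formalization (a sketch based on the abstract only, not on the paper's own notation), suppose the class of vectors is modeled by a random vector $v$ with second-moment matrix $\Sigma := \mathbb{E}[vv^\top]$, and a dictionary is a matrix $D = [d_1 \cdots d_K]$ whose atoms are assumed here to satisfy $\|d_k\|_2 \le 1$; the exact admissible set in the paper may differ. The $\ell_2$-optimal dictionary problem then reads

$$
\min_{D \in \mathcal{D}} \; \mathbb{E}\Big[\, \min_{x \,:\, Dx = v} \|x\|_2^2 \,\Big],
\qquad
\mathcal{D} := \big\{ D = [d_1 \cdots d_K] : \|d_k\|_2 \le 1 \ \text{for all } k \big\}.
$$

Whenever $Dx = v$ is feasible, the inner minimum is attained by the minimum-norm least-squares coefficients $x = D^{\dagger} v$, so the objective becomes $\mathbb{E}\big[\|D^{\dagger} v\|_2^2\big] = \operatorname{tr}\!\big((D^{\dagger})^\top D^{\dagger}\, \Sigma\big)$, i.e., it depends on the class only through the symmetric positive semidefinite matrix $\Sigma$, which is consistent with the abstract's appeal to rank-1 decompositions of such matrices.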
Cite
Text
Sheriff and Chatterjee. "Optimal Dictionary for Least Squares Representation." Journal of Machine Learning Research, 2017.
Markdown
[Sheriff and Chatterjee. "Optimal Dictionary for Least Squares Representation." Journal of Machine Learning Research, 2017.](https://mlanthology.org/jmlr/2017/sheriff2017jmlr-optimal/)
BibTeX
@article{sheriff2017jmlr-optimal,
  title = {{Optimal Dictionary for Least Squares Representation}},
  author = {Sheriff, Mohammed Rayyan and Chatterjee, Debasish},
  journal = {Journal of Machine Learning Research},
  year = {2017},
  volume = {18},
  pages = {1-28},
  url = {https://mlanthology.org/jmlr/2017/sheriff2017jmlr-optimal/}
}