Multiscale Dictionary Learning: Non-Asymptotic Bounds and Robustness
Abstract
High-dimensional datasets are well-approximated by low-dimensional structures. Over the past decade, this empirical observation motivated the investigation of detection, measurement, and modeling techniques to exploit these low-dimensional intrinsic structures, yielding numerous implications for high-dimensional statistics, machine learning, and signal processing. Manifold learning (where the low-dimensional structure is a manifold) and dictionary learning (where the low-dimensional structure is the set of sparse linear combinations of vectors from a finite dictionary) are two prominent theoretical and computational frameworks in this area. Despite their ostensible distinction, the recently introduced Geometric Multi-Resolution Analysis (GMRA) provides a robust, computationally efficient, multiscale procedure for simultaneously learning manifolds and dictionaries.
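For reference, the dictionary-learning model mentioned in the abstract can be written as the standard sparse-coding formulation below; this is a minimal sketch, and the symbols D, K, s are introduced here for illustration rather than taken from the paper.

\[
  x \;\approx\; \Phi \alpha, \qquad \Phi = [\varphi_1, \dots, \varphi_K] \in \mathbb{R}^{D \times K}, \qquad \|\alpha\|_0 \le s \ll D,
\]

i.e., each data point x in R^D is approximated by a linear combination of at most s atoms from a finite dictionary Phi, whereas manifold learning instead assumes the data lie near a low-dimensional manifold embedded in R^D.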
Cite
Text
Maggioni et al. "Multiscale Dictionary Learning: Non-Asymptotic Bounds and Robustness." Journal of Machine Learning Research, 2016.
Markdown
[Maggioni et al. "Multiscale Dictionary Learning: Non-Asymptotic Bounds and Robustness." Journal of Machine Learning Research, 2016.](https://mlanthology.org/jmlr/2016/maggioni2016jmlr-multiscale/)
BibTeX
@article{maggioni2016jmlr-multiscale,
title = {{Multiscale Dictionary Learning: Non-Asymptotic Bounds and Robustness}},
author = {Maggioni, Mauro and Minsker, Stanislav and Strawn, Nate},
journal = {Journal of Machine Learning Research},
year = {2016},
pages = {1-51},
volume = {17},
url = {https://mlanthology.org/jmlr/2016/maggioni2016jmlr-multiscale/}
}