Proximal Methods for Sparse Hierarchical Dictionary Learning
Abstract
We propose to combine two approaches for modeling data admitting sparse representations: on the one hand, dictionary learning has proven effective for various signal processing tasks; on the other hand, recent work on structured sparsity provides a natural framework for modeling dependencies between dictionary elements. We thus consider a tree-structured sparse regularization to learn dictionaries embedded in a hierarchy. The proximal operator involved can be computed exactly via a primal-dual method, allowing the use of accelerated gradient techniques. Experiments show that, for natural image patches, learned dictionary elements organize themselves into such a hierarchical structure, leading to improved performance on restoration tasks. When applied to text documents, our method learns hierarchies of topics, thus providing a competitive alternative to probabilistic topic models.
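The computational core described in the abstract is the exact proximal operator of a tree-structured sparsity-inducing norm. A known property of such norms (with ℓ2 group penalties on nested groups) is that the prox decomposes into a sequence of block soft-thresholdings applied from the leaves of the tree up to the root. Below is a minimal illustrative sketch under that assumption; the function names, the toy hierarchy, and the uniform weight `lam` are hypothetical choices for demonstration, not the paper's actual code.

```python
import numpy as np

def prox_group(w, group, lam):
    # Block soft-thresholding of the coefficients indexed by `group`:
    # shrink the sub-vector toward zero by `lam` in Euclidean norm.
    norm = np.linalg.norm(w[group])
    scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
    w[group] = scale * w[group]
    return w

def prox_tree(v, groups, lam):
    """Proximal operator of the tree-structured norm
    lam * sum_g ||w_g||_2, where `groups` lists index sets ordered
    from leaves to root (each group's descendants appear before it).
    For nested (tree-structured) groups, composing the individual
    group proxes in this order yields the exact solution."""
    w = v.copy()
    for group in groups:
        w = prox_group(w, group, lam)
    return w

# Toy hierarchy on 3 variables: two singleton leaf groups,
# then a root group covering all variables (leaves listed first).
groups = [[1], [2], [0, 1, 2]]
v = np.array([1.0, 2.0, 0.5])
w = prox_tree(v, groups, lam=1.0)
# Note: once a group is shrunk to zero, all variables it contains
# stay zero, producing the hierarchical sparsity pattern.
```

A convenient consequence of this hierarchical pattern is that a dictionary element is selected only if its ancestors in the tree are also selected, which is what makes the learned dictionaries organize into a hierarchy.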
Cite
Text
Jenatton et al. "Proximal Methods for Sparse Hierarchical Dictionary Learning." International Conference on Machine Learning, 2010.

Markdown

[Jenatton et al. "Proximal Methods for Sparse Hierarchical Dictionary Learning." International Conference on Machine Learning, 2010.](https://mlanthology.org/icml/2010/jenatton2010icml-proximal/)

BibTeX
@inproceedings{jenatton2010icml-proximal,
title = {{Proximal Methods for Sparse Hierarchical Dictionary Learning}},
author = {Jenatton, Rodolphe and Mairal, Julien and Obozinski, Guillaume and Bach, Francis R.},
booktitle = {International Conference on Machine Learning},
year = {2010},
pages = {487-494},
url = {https://mlanthology.org/icml/2010/jenatton2010icml-proximal/}
}