Unsupervised Slow Subspace-Learning from Stationary Processes
Abstract
We propose a method of unsupervised learning from stationary, vector-valued processes. A low-dimensional subspace is selected on the basis of a criterion which rewards data-variance (like PSA) and penalizes the variance of the velocity vector, thus exploiting the short-time dependencies of the process. We prove error bounds in terms of the β-mixing coefficients and consistency for absolutely regular processes. Experiments with image recognition demonstrate the algorithm's ability to learn geometrically invariant feature maps.
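The criterion described in the abstract trades off data variance against velocity variance. A minimal sketch of this idea (not the paper's exact algorithm): form the covariance of the samples and the covariance of the one-step differences, and take the top eigenvectors of their weighted difference. The function name `slow_subspace` and the trade-off weight `lam` are illustrative choices, not from the paper.

```python
import numpy as np

def slow_subspace(X, d, lam=1.0):
    """Select a d-dim subspace rewarding data variance and
    penalizing velocity variance (illustrative sketch).

    X: (T, n) array of consecutive samples from a stationary process.
    """
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc / len(Xc)            # data covariance
    V = np.diff(Xc, axis=0)            # discrete velocity vectors
    Cdot = V.T @ V / len(V)            # velocity covariance
    M = C - lam * Cdot                 # variance reward minus slowness penalty
    w, U = np.linalg.eigh(M)           # symmetric eigendecomposition
    order = np.argsort(w)[::-1]        # largest criterion values first
    return U[:, order[:d]]             # (n, d) orthonormal basis
```

On a process with one slowly varying coordinate and one fast noisy coordinate, the selected direction aligns with the slow coordinate, since the noise's velocity variance outweighs its data variance.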
Cite
Text
Maurer. "Unsupervised Slow Subspace-Learning from Stationary Processes." International Conference on Algorithmic Learning Theory, 2006. doi:10.1007/11894841_29
Markdown
[Maurer. "Unsupervised Slow Subspace-Learning from Stationary Processes." International Conference on Algorithmic Learning Theory, 2006.](https://mlanthology.org/alt/2006/maurer2006alt-unsupervised/) doi:10.1007/11894841_29
BibTeX
@inproceedings{maurer2006alt-unsupervised,
title = {{Unsupervised Slow Subspace-Learning from Stationary Processes}},
author = {Maurer, Andreas},
booktitle = {International Conference on Algorithmic Learning Theory},
year = {2006},
pages = {363-377},
doi = {10.1007/11894841_29},
url = {https://mlanthology.org/alt/2006/maurer2006alt-unsupervised/}
}