Unsupervised Progressive Learning and the STAM Architecture
Abstract
We first pose the Unsupervised Progressive Learning (UPL) problem: an online representation learning problem in which the learner observes a non-stationary and unlabeled data stream and learns a growing number of features that persist over time, even though the data is not stored or replayed. To solve the UPL problem we propose the Self-Taught Associative Memory (STAM) architecture. Layered hierarchies of STAM modules learn through a combination of online clustering, novelty detection, forgetting of outliers, and storing only prototypical features rather than specific examples. We evaluate STAM representations using clustering and classification tasks. Although no existing learning scenario is directly comparable to UPL, we compare the STAM architecture with two recent continual learning models, Memory Aware Synapses (MAS) and Gradient Episodic Memories (GEM), after adapting them to the UPL setting.
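The mechanism described above (online clustering with novelty detection, outlier forgetting, and prototype-only storage) can be sketched in miniature as follows. This is a hypothetical illustration of the general technique, not the paper's actual STAM implementation; all names, thresholds, and the forgetting rule are assumptions made for the example.

```python
import numpy as np

class OnlineClusteringModule:
    """Illustrative sketch of a STAM-like module: online centroid clustering
    with novelty detection and outlier forgetting. Only prototypes (centroids)
    are stored, never raw examples. Parameters are illustrative assumptions."""

    def __init__(self, novelty_threshold, lr=0.1, forget_after=50):
        self.centroids = []              # learned prototypes, not raw inputs
        self.counts = []                 # matches per prototype
        self.last_used = []              # step of each prototype's last match
        self.novelty_threshold = novelty_threshold
        self.lr = lr                     # learning rate for centroid updates
        self.forget_after = forget_after # staleness limit for rare prototypes
        self.step = 0

    def observe(self, x):
        """Process one input from the stream; return the matched prototype index."""
        self.step += 1
        x = np.asarray(x, dtype=float)
        if self.centroids:
            dists = [np.linalg.norm(x - c) for c in self.centroids]
            j = int(np.argmin(dists))
            if dists[j] <= self.novelty_threshold:
                # Familiar input: nudge the nearest prototype toward it.
                self.centroids[j] += self.lr * (x - self.centroids[j])
                self.counts[j] += 1
                self.last_used[j] = self.step
                self._forget_outliers()
                return j
        # Novelty detected: store a new prototype for this input.
        self.centroids.append(x.copy())
        self.counts.append(1)
        self.last_used.append(self.step)
        self._forget_outliers()
        return len(self.centroids) - 1

    def _forget_outliers(self):
        # Drop prototypes that matched only once and have gone stale.
        keep = [i for i in range(len(self.centroids))
                if self.counts[i] > 1
                or self.step - self.last_used[i] < self.forget_after]
        self.centroids = [self.centroids[i] for i in keep]
        self.counts = [self.counts[i] for i in keep]
        self.last_used = [self.last_used[i] for i in keep]
```

A stream drawn from two well-separated regions causes the module to create exactly two prototypes: the first input of each region fails the novelty test and spawns a centroid, while later inputs refine the existing ones. Prototypes that persist do so because the stream keeps revisiting them, which is the "features that persist over time" behavior without storing or replaying any data.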
Cite
Text
Smith et al. "Unsupervised Progressive Learning and the STAM Architecture." International Joint Conference on Artificial Intelligence, 2021. doi:10.24963/IJCAI.2021/410
Markdown
[Smith et al. "Unsupervised Progressive Learning and the STAM Architecture." International Joint Conference on Artificial Intelligence, 2021.](https://mlanthology.org/ijcai/2021/smith2021ijcai-unsupervised/) doi:10.24963/IJCAI.2021/410
BibTeX
@inproceedings{smith2021ijcai-unsupervised,
title = {{Unsupervised Progressive Learning and the STAM Architecture}},
author = {Smith, James Seale and Taylor, Cameron E. and Baer, Seth and Dovrolis, Constantine},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2021},
pages = {2979--2987},
doi = {10.24963/IJCAI.2021/410},
url = {https://mlanthology.org/ijcai/2021/smith2021ijcai-unsupervised/}
}