Recursive SNE: Fast Prototype-Based T-SNE for Large-Scale and Online Data
Abstract
Dimensionality reduction techniques like t-SNE excel at visualizing structure in high-dimensional data but incur high computational costs that limit their use on large or streaming datasets. We introduce the Recursive SNE (RSNE) framework, which extends t-SNE with two complementary strategies: i-RSNE for real-time, point-wise updates and Bi-RSNE for efficient batch processing. Across diverse settings, including standard image benchmarks (CIFAR10/CIFAR100) with DINOv2 and CLIP features, domain-specific iROADS road scenes, neuroimaging data from the Haxby fMRI dataset, and long-term climate records, RSNE delivers substantial speedups over Barnes–Hut t-SNE while maintaining or even improving cluster separability. By combining a lightweight prototype-based initialization with localized KL-divergence refinements, RSNE offers a scalable and adaptable framework for both large-scale offline embedding and on-the-fly visualization of streaming data.
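To make the abstract's description concrete, here is a minimal illustrative sketch of the general idea it outlines (prototype-based initialization followed by a localized KL-divergence refinement), not the authors' implementation. The function name `insert_point`, the fixed-bandwidth Gaussian affinities standing in for a perplexity search, and all parameter choices are hypothetical assumptions for illustration only.

```python
import numpy as np

def insert_point(x_new, proto_hd, proto_ld, k=5, perplexity=10.0,
                 lr=0.1, n_iter=50):
    """Illustrative sketch (not the paper's code): embed one new point by
    (1) initializing it at a distance-weighted average of the low-dimensional
    positions of its k nearest prototypes, then (2) refining that single
    position by gradient descent on a KL divergence computed only against
    those prototypes, whose embeddings stay fixed.

    proto_hd: (m, D) high-dimensional prototypes
    proto_ld: (m, 2) their fixed low-dimensional embeddings
    """
    # Step 1: lightweight prototype-based initialization
    d2 = np.sum((proto_hd - x_new) ** 2, axis=1)
    nn = np.argsort(d2)[:k]
    w = 1.0 / (d2[nn] + 1e-12)
    y = (w[:, None] * proto_ld[nn]).sum(axis=0) / w.sum()

    # Step 2: localized KL refinement
    # High-dimensional affinities p_j to the k nearest prototypes
    # (Gaussian kernel; a fixed bandwidth stands in for a perplexity search).
    sigma2 = np.mean(d2[nn]) / perplexity + 1e-12
    p = np.exp(-d2[nn] / (2.0 * sigma2))
    p /= p.sum()

    for _ in range(n_iter):
        diff = y - proto_ld[nn]                # (k, 2)
        dist2 = np.sum(diff ** 2, axis=1)
        q_unnorm = 1.0 / (1.0 + dist2)         # Student-t kernel, as in t-SNE
        q = q_unnorm / q_unnorm.sum()
        # Gradient of KL(p || q) with respect to the new point's position.
        grad = 4.0 * ((p - q) * q_unnorm)[:, None] * diff
        y -= lr * grad.sum(axis=0)

    return y
```

Under the same simplified reading, the batch variant (Bi-RSNE) would apply this initialize-then-refine step to a whole block of arriving points at once rather than one point at a time.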
Cite
Text
Aghasanli and Angelov. "Recursive SNE: Fast Prototype-Based T-SNE for Large-Scale and Online Data." Transactions on Machine Learning Research, 2025.
Markdown
[Aghasanli and Angelov. "Recursive SNE: Fast Prototype-Based T-SNE for Large-Scale and Online Data." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/aghasanli2025tmlr-recursive/)
BibTeX
@article{aghasanli2025tmlr-recursive,
title = {{Recursive SNE: Fast Prototype-Based T-SNE for Large-Scale and Online Data}},
author = {Aghasanli, Agil and Angelov, Plamen P},
journal = {Transactions on Machine Learning Research},
year = {2025},
url = {https://mlanthology.org/tmlr/2025/aghasanli2025tmlr-recursive/}
}