ActUp: Analyzing and Consolidating tSNE and UMAP
Abstract
TSNE and UMAP are popular dimensionality reduction algorithms due to their speed and interpretable low-dimensional embeddings. Despite their popularity, little work has been done to study their full span of differences. We theoretically and experimentally evaluate the space of parameters in the TSNE and UMAP algorithms and observe that a single one -- the normalization -- is responsible for switching between them. This, in turn, implies that a majority of the algorithmic differences can be toggled without affecting the embeddings. We discuss the implications this has on several theoretical claims behind UMAP, as well as how to reconcile them with existing TSNE interpretations. Based on our analysis, we provide a method (GDR) that combines previously incompatible techniques from TSNE and UMAP and can replicate the results of either algorithm. This allows our method to incorporate further improvements, such as an acceleration that obtains either method's outputs faster than UMAP. We release improved versions of TSNE, UMAP, and GDR that are fully plug-and-play with the traditional libraries.
Cite
Text
Draganov et al. "ActUp: Analyzing and Consolidating tSNE and UMAP." International Joint Conference on Artificial Intelligence, 2023. doi:10.24963/IJCAI.2023/406
Markdown
[Draganov et al. "ActUp: Analyzing and Consolidating tSNE and UMAP." International Joint Conference on Artificial Intelligence, 2023.](https://mlanthology.org/ijcai/2023/draganov2023ijcai-actup/) doi:10.24963/IJCAI.2023/406
BibTeX
@inproceedings{draganov2023ijcai-actup,
title = {{ActUp: Analyzing and Consolidating tSNE and UMAP}},
  author = {Draganov, Andrew and Jørgensen, Jakob Rødsgaard and Scheel, Katrine and Mottin, Davide and Assent, Ira and Berry, Tyrus and Aslay, Çiğdem},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2023},
  pages = {3651--3658},
doi = {10.24963/IJCAI.2023/406},
url = {https://mlanthology.org/ijcai/2023/draganov2023ijcai-actup/}
}