2-MAP: Aligned Visualizations for Comparison of High-Dimensional Point Sets

Abstract

Visualization tools like t-SNE and UMAP give insight into the high-dimensional structure of datasets. When there are related datasets (such as the high-dimensional representations of image data created by two different deep learning architectures), roughly aligning those visualizations helps highlight both the similarities and the differences. In this paper we propose a method to align multiple low-dimensional visualizations by adding an alignment term to the UMAP loss function. We provide an automated procedure to find a weight for this term that encourages the alignment while only minimally changing the fidelity of the underlying embedding.
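To make the idea of a weighted alignment term concrete, here is a minimal sketch, not the paper's implementation: it assumes a simple mean-squared-distance penalty between corresponding points of two 2-D embeddings, stands in for the actual UMAP loss with a placeholder value, and uses hypothetical names (`alignment_loss`, `combined_loss`, `weight`). The automated weight-selection procedure from the paper is not reproduced here.

```python
import numpy as np

def alignment_loss(embedding, reference):
    """Mean squared distance between corresponding points of two 2-D embeddings
    (one illustrative choice of alignment penalty; assumption, not the paper's exact term)."""
    return np.mean(np.sum((embedding - reference) ** 2, axis=1))

def combined_loss(umap_loss_value, embedding, reference, weight):
    """UMAP-style loss plus a weighted alignment term. `weight` would be chosen by
    the paper's automated procedure; here it is just a scalar argument."""
    return umap_loss_value + weight * alignment_loss(embedding, reference)

# Toy usage: two small 2-D point sets and a placeholder value for the UMAP loss.
rng = np.random.default_rng(0)
emb = rng.normal(size=(5, 2))   # embedding being optimized
ref = rng.normal(size=(5, 2))   # reference embedding to align toward
print(combined_loss(umap_loss_value=1.0, embedding=emb, reference=ref, weight=0.1))
```

The intent of the weight is visible in the structure: a small value leaves the embedding essentially governed by the UMAP term, while a larger value pulls corresponding points toward the reference layout.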

Cite

Text

Liu et al. "2-MAP: Aligned Visualizations for Comparison of High-Dimensional Point Sets." Winter Conference on Applications of Computer Vision, 2020.

Markdown

[Liu et al. "2-MAP: Aligned Visualizations for Comparison of High-Dimensional Point Sets." Winter Conference on Applications of Computer Vision, 2020.](https://mlanthology.org/wacv/2020/liu2020wacv-2map/)

BibTeX

@inproceedings{liu2020wacv-2map,
  title     = {{2-MAP: Aligned Visualizations for Comparison of High-Dimensional Point Sets}},
  author    = {Liu, Xiaotong and Zhang, Zeyu and Xuan, Hong and Leontie, Roxana and Stylianou, Abby and Pless, Robert},
  booktitle = {Winter Conference on Applications of Computer Vision},
  year      = {2020},
  url       = {https://mlanthology.org/wacv/2020/liu2020wacv-2map/}
}