Fully Dynamic Embedding into $\ell_p$ Spaces

Abstract

Metric embeddings are fundamental in machine learning, enabling similarity search, dimensionality reduction, and representation learning. They underpin modern architectures like transformers and large language models, facilitating scalable training and improved generalization. Theoretically, the classic problem in embedding design is mapping arbitrary metrics into $\ell_p$ spaces while approximately preserving pairwise distances. We study this problem in a fully dynamic setting, where the underlying metric is a graph metric subject to edge insertions and deletions. Our goal is to maintain an efficient embedding after each update. We present the first fully dynamic algorithm for this problem, achieving $O(\log n)^{2q} \cdot O(\log(nW))^{q-1}$ expected distortion with $O(m^{1/q + o(1)})$ update time and $O(q \log n \log(nW))$ query time, where $q \ge 2$ is an integer parameter.
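
For concreteness, here is a direct instantiation of the stated trade-off at the smallest parameter value $q = 2$ (a straightforward substitution into the abstract's expressions, reading $n$ and $m$ as the number of vertices and edges of the graph, as the notation suggests):

$$
\text{distortion} = O(\log n)^{4} \cdot O(\log(nW)), \qquad
\text{update time} = O\big(m^{1/2 + o(1)}\big), \qquad
\text{query time} = O\big(\log n \, \log(nW)\big).
$$

Larger values of $q$ trade accuracy for speed: the update time $O(m^{1/q + o(1)})$ decreases toward $m^{o(1)}$, while the distortion bound grows polylogarithmically with exponents $2q$ and $q-1$.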

Cite

Text

Banihashem et al. "Fully Dynamic Embedding into $\ell_p$ Spaces." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Banihashem et al. "Fully Dynamic Embedding into $\ell_p$ Spaces." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/banihashem2025icml-fully/)

BibTeX

@inproceedings{banihashem2025icml-fully,
  title     = {{Fully Dynamic Embedding into $\ell_p$ Spaces}},
  author    = {Banihashem, Kiarash and Chen, Xiang and Hajiaghayi, Mohammadtaghi and Kim, Sungchul and Mahadik, Kanak and Rossi, Ryan A. and Yu, Tong},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {2894--2909},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/banihashem2025icml-fully/}
}