Spherical Text Embedding

Abstract

Unsupervised text embedding has shown great power in a wide range of NLP tasks. While text embeddings are typically learned in the Euclidean space, directional similarity is often more effective in tasks such as word similarity and document clustering, which creates a gap between the training stage and the usage stage of text embedding. To close this gap, we propose a spherical generative model based on which unsupervised word and paragraph embeddings are jointly learned. To learn text embeddings in the spherical space, we develop an efficient optimization algorithm with a convergence guarantee based on Riemannian optimization. Our model enjoys high efficiency and achieves state-of-the-art performance on various text embedding tasks including word similarity and document clustering.
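The core idea of Riemannian optimization on the unit sphere can be illustrated with a minimal sketch: project the Euclidean gradient onto the tangent space at the current point, take a step, and retract back to the sphere by renormalizing. This is a generic illustration of the technique, not the paper's actual training procedure; the toy objective (cosine similarity with a fixed target direction `t`) and the function name are assumptions for demonstration.

```python
import numpy as np

def riemannian_sgd_step(x, euclidean_grad, lr=0.1):
    """One Riemannian SGD step on the unit sphere (illustrative sketch).

    Projects the Euclidean gradient onto the tangent space at x,
    steps along the negative tangent gradient, and retracts back
    to the sphere by renormalizing.
    """
    # Tangent-space projection: remove the radial (normal) component.
    tangent_grad = euclidean_grad - np.dot(euclidean_grad, x) * x
    # Step, then retract to the unit sphere via renormalization.
    x_new = x - lr * tangent_grad
    return x_new / np.linalg.norm(x_new)

# Toy objective (hypothetical): maximize cosine similarity with a
# target direction t, i.e. minimize f(x) = -t.x, whose gradient is -t.
t = np.array([0.0, 0.0, 1.0])
x = np.array([1.0, 0.0, 0.0])
for _ in range(200):
    x = riemannian_sgd_step(x, -t, lr=0.1)
# x stays on the unit sphere throughout and converges toward t.
```

The renormalization step is a standard retraction for the sphere; it keeps every iterate exactly on the manifold, which is what lets training match the directional (cosine) similarity used at evaluation time.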

Cite

Text

Meng et al. "Spherical Text Embedding." Neural Information Processing Systems, 2019.

Markdown

[Meng et al. "Spherical Text Embedding." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/meng2019neurips-spherical/)

BibTeX

@inproceedings{meng2019neurips-spherical,
  title     = {{Spherical Text Embedding}},
  author    = {Meng, Yu and Huang, Jiaxin and Wang, Guangyuan and Zhang, Chao and Zhuang, Honglei and Kaplan, Lance and Han, Jiawei},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {8208--8217},
  url       = {https://mlanthology.org/neurips/2019/meng2019neurips-spherical/}
}