Graph-Based Dynamic Word Embeddings

Abstract

Language evolves over time, and word semantics shift with it. Unfortunately, traditional word embedding methods neglect this evolution and assume that word representations are static. Although contextualized word embedding models can capture the diverse representations of polysemous words, they likewise ignore temporal information. To tackle these challenges, we propose a graph-based dynamic word embedding (GDWE) model that continually captures the semantic drift of words. We introduce word-level knowledge graphs (WKGs) to store short-term and long-term knowledge. WKGs provide rich structural information as a supplement to lexical information, which helps improve word embedding quality and capture semantic drift quickly. Theoretical analysis and extensive experiments validate the effectiveness of GDWE on dynamic word embedding learning.
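
For intuition only, the minimal Python sketch below illustrates the kind of mechanism the abstract describes: a short-term co-occurrence graph built per time slice, a long-term graph that accumulates edges with decay, and a structural pull of each word's vector toward its graph neighbors. All names and parameters here (build_wkg, merge_wkg, graph_smooth, decay, lr) are our assumptions for illustration, not the paper's actual GDWE algorithm or API.

# Hypothetical sketch, not the authors' GDWE implementation; function
# names, parameters, and the smoothing rule are illustrative assumptions.
from collections import defaultdict

import numpy as np

def build_wkg(corpus, window=2):
    # Short-term WKG: weighted co-occurrence edges from one time slice,
    # given as a list of tokenized sentences.
    edges = defaultdict(float)
    for sent in corpus:
        for i, w in enumerate(sent):
            for v in sent[max(0, i - window):i]:
                if v != w:
                    edges[frozenset((v, w))] += 1.0
    return edges

def merge_wkg(long_term, short_term, decay=0.9):
    # Long-term WKG: fold the current slice's edges into the accumulated
    # graph, decaying older evidence so stale senses fade gradually.
    merged = defaultdict(float, {e: c * decay for e, c in long_term.items()})
    for e, c in short_term.items():
        merged[e] += c
    return merged

def graph_smooth(emb, wkg, lr=0.05):
    # Use graph structure as a regularizer: pull each word's vector
    # toward its WKG neighbors, weighted by edge strength.
    new_emb = {w: v.copy() for w, v in emb.items()}
    for edge, weight in wkg.items():
        u, v = tuple(edge)
        delta = lr * weight * (emb[v] - emb[u])
        new_emb[u] += delta
        new_emb[v] -= delta
    return new_emb

# Toy usage: "apple" drifts from a fruit context to a tech context.
rng = np.random.default_rng(0)
slices = [
    [["apple", "fruit", "pie"], ["apple", "orchard", "tree"]],    # older slice
    [["apple", "iphone", "launch"], ["apple", "stock", "price"]], # newer slice
]
vocab = {w for corpus in slices for sent in corpus for w in sent}
emb = {w: rng.normal(size=8) for w in vocab}
long_term = {}
for corpus in slices:  # stream time slices in order
    short_term = build_wkg(corpus)
    long_term = merge_wkg(long_term, short_term)
    emb = graph_smooth(emb, long_term)

In this toy run, "apple" is first pulled toward "fruit" and "orchard", then toward "iphone" and "stock" as the newer slice's edges dominate the decayed long-term graph, which is the sense in which graph structure lets the embedding track drift.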

Cite

Text

Lu et al. "Graph-Based Dynamic Word Embeddings." International Joint Conference on Artificial Intelligence, 2022. doi:10.24963/IJCAI.2022/594

Markdown

[Lu et al. "Graph-Based Dynamic Word Embeddings." International Joint Conference on Artificial Intelligence, 2022.](https://mlanthology.org/ijcai/2022/lu2022ijcai-graph/) doi:10.24963/IJCAI.2022/594

BibTeX

@inproceedings{lu2022ijcai-graph,
  title     = {{Graph-Based Dynamic Word Embeddings}},
  author    = {Lu, Yuyin and Cheng, Xin and Liang, Ziran and Rao, Yanghui},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2022},
  pages     = {4280--4288},
  doi       = {10.24963/IJCAI.2022/594},
  url       = {https://mlanthology.org/ijcai/2022/lu2022ijcai-graph/}
}