RSDNE: Exploring Relaxed Similarity and Dissimilarity from Completely-Imbalanced Labels for Network Embedding

Abstract

Network embedding, which aims to project a network into a low-dimensional space, is increasingly becoming a focus of network research. Semi-supervised network embedding takes advantage of labeled data and has shown promising performance. However, existing semi-supervised methods perform poorly in the completely-imbalanced label setting, where some classes have no labeled nodes at all. To alleviate this problem, we propose a novel semi-supervised network embedding method, termed Relaxed Similarity and Dissimilarity Network Embedding (RSDNE). Specifically, to benefit from the completely-imbalanced labels, RSDNE guarantees both intra-class similarity and inter-class dissimilarity in an approximate way. Experimental results on several real-world datasets demonstrate the superiority of the proposed method.
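For intuition, the sketch below shows how a semi-supervised embedding objective of this general flavor might combine a structural proximity term with an intra-class similarity term and a relaxed inter-class dissimilarity term. It is a toy illustration only: the function name, the hinge-style relaxation, and the weights alpha and beta are assumptions for exposition, not the RSDNE objective or algorithm from the paper.

import numpy as np

def toy_embedding_loss(Z, A, labels, alpha=1.0, beta=1.0):
    """Illustrative semi-supervised embedding loss (not the paper's objective).
    Z: (n, d) node embeddings; A: (n, n) adjacency matrix;
    labels: length-n integer array with -1 marking unlabeled nodes."""
    # pairwise squared distances between embeddings
    sq = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)

    # (1) structural term: nodes linked in the graph should be close
    structure = (A * sq).sum()

    labeled = labels >= 0
    same = (labels[:, None] == labels[None, :]) & labeled[:, None] & labeled[None, :]
    diff = (labels[:, None] != labels[None, :]) & labeled[:, None] & labeled[None, :]

    # (2) intra-class term: pull labeled same-class pairs together
    intra = sq[same].sum()

    # (3) relaxed inter-class term: only discourage closeness for
    #     different-class labeled pairs that the graph links, instead of
    #     pushing every different-class pair far apart
    inter = (A * diff * np.maximum(0.0, 1.0 - sq)).sum()

    return structure + alpha * intra + beta * inter

# tiny usage example: 4 nodes, two labeled (classes 0 and 1), two unlabeled (-1)
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
labels = np.array([0, 1, -1, -1])
Z = np.random.default_rng(0).normal(size=(4, 2))
print(toy_embedding_loss(Z, A, labels))

The relaxation illustrated here (penalizing only graph-linked different-class pairs via a hinge) is one plausible way to approximate inter-class dissimilarity without requiring labeled nodes from every class; the paper's actual formulation should be consulted for the precise terms.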

Cite

Text

Wang et al. "RSDNE: Exploring Relaxed Similarity and Dissimilarity from Completely-Imbalanced Labels for Network Embedding." AAAI Conference on Artificial Intelligence, 2018. doi:10.1609/AAAI.V32I1.11242

Markdown

[Wang et al. "RSDNE: Exploring Relaxed Similarity and Dissimilarity from Completely-Imbalanced Labels for Network Embedding." AAAI Conference on Artificial Intelligence, 2018.](https://mlanthology.org/aaai/2018/wang2018aaai-rsdne/) doi:10.1609/AAAI.V32I1.11242

BibTeX

@inproceedings{wang2018aaai-rsdne,
  title     = {{RSDNE: Exploring Relaxed Similarity and Dissimilarity from Completely-Imbalanced Labels for Network Embedding}},
  author    = {Wang, Zheng and Ye, Xiaojun and Wang, Chaokun and Wu, Yuexin and Wang, Changping and Liang, Kaiwen},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2018},
  pages     = {475--482},
  doi       = {10.1609/AAAI.V32I1.11242},
  url       = {https://mlanthology.org/aaai/2018/wang2018aaai-rsdne/}
}