SNEQ: Semi-Supervised Attributed Network Embedding with Attention-Based Quantisation
Abstract
Learning accurate low-dimensional embeddings for a network is a crucial task as it facilitates many network analytics tasks. However, trained embeddings often require a significant amount of space to store, making storage and processing a challenge, especially as large-scale networks become more prevalent. In this paper, we present a novel semi-supervised network embedding and compression method, SNEQ, that is competitive with state-of-the-art embedding methods while being far more space- and time-efficient. SNEQ incorporates a novel quantisation method based on a self-attention layer that is trained in an end-to-end fashion and is able to dramatically compress the size of the trained embeddings, thus reducing the storage footprint and accelerating retrieval. Our evaluation on four real-world networks of diverse characteristics shows that SNEQ outperforms a number of state-of-the-art embedding methods in link prediction, node classification and node recommendation. Moreover, the quantised embeddings show a great advantage in terms of storage and query time compared with continuous embeddings as well as hashing-based methods.
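To make the storage argument concrete, here is a minimal sketch of the general idea behind attention-style quantisation of embeddings: node vectors attend over a shared codebook during training (a differentiable soft assignment), while only a small code index per node is stored at inference time. All sizes, the single-codebook setup, and the scaled-dot-product assignment are illustrative assumptions for this sketch, not the architecture described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative sizes, not from the paper): 1000 node
# embeddings of dimension 64 and a shared codebook of 16 centroids.
n_nodes, dim, n_codes = 1000, 64, 16
embeddings = rng.standard_normal((n_nodes, dim)).astype(np.float32)
codebook = rng.standard_normal((n_codes, dim)).astype(np.float32)

# Attention-style soft assignment: scaled dot-product scores of each
# embedding against every codeword, followed by a softmax. The soft
# reconstruction is differentiable, so the codebook can be trained
# end-to-end alongside the embedding model.
scores = embeddings @ codebook.T / np.sqrt(dim)        # (n_nodes, n_codes)
weights = np.exp(scores - scores.max(axis=1, keepdims=True))
weights /= weights.sum(axis=1, keepdims=True)          # row-wise softmax
soft_quantised = weights @ codebook                    # used during training

# At inference, keep only the hard argmax code: one byte per node
# instead of 64 float32 values (256 bytes).
codes = scores.argmax(axis=1).astype(np.uint8)
reconstructed = codebook[codes]                        # lookup to decompress

full_bytes = embeddings.nbytes                 # 1000 * 64 * 4 = 256000
compressed_bytes = codes.nbytes + codebook.nbytes  # 1000 + 4096 = 5096
print(full_bytes, compressed_bytes)
```

In this toy configuration the per-node cost drops from 256 bytes to 1 byte plus an amortised shared codebook, which is the kind of compression-versus-fidelity trade-off the abstract refers to.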
Cite
Text
He et al. "SNEQ: Semi-Supervised Attributed Network Embedding with Attention-Based Quantisation." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I04.5832
Markdown
[He et al. "SNEQ: Semi-Supervised Attributed Network Embedding with Attention-Based Quantisation." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/he2020aaai-sneq/) doi:10.1609/AAAI.V34I04.5832
BibTeX
@inproceedings{he2020aaai-sneq,
title = {{SNEQ: Semi-Supervised Attributed Network Embedding with Attention-Based Quantisation}},
author = {He, Tao and Gao, Lianli and Song, Jingkuan and Wang, Xin and Huang, Kejie and Li, Yuanfang},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2020},
pages = {4091-4098},
doi = {10.1609/AAAI.V34I04.5832},
url = {https://mlanthology.org/aaai/2020/he2020aaai-sneq/}
}