Graph Contrastive Representation Learning with Input-Aware and Cluster-Aware Regularization
Abstract
Graph Contrastive Learning (GCL), with broad applications in network analysis and mining, is attracting growing research interest. Although GCL is an outstanding self-supervised technique that successfully extracts concise yet useful information by contrasting different augmented graph views, it still faces a major challenge: how to organize the extracted semantic information into a structure that a downstream classifier can readily exploit. In this paper, we propose a novel cluster-based GCL framework that obtains a semantically well-formed structure of node embeddings by maximizing the mutual information between the input graph and the output embeddings, and that yields a clearer decision boundary through a cluster-level global-local contrastive task. We prove that the proposed method correctly maximizes the mutual information between an input graph and its output embeddings. We further improve the method's practical performance by incorporating additional refinements, e.g., measuring the uncertainty of clustering and extracting additional structural information via a local-local node-level contrasting module enhanced by Graph Cut. Finally, extensive experiments on six real-world datasets demonstrate the performance gains of our method over the most prevalent state-of-the-art models.
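The abstract's core mechanism, contrasting embeddings of the same nodes under different graph augmentations, is typically instantiated with an InfoNCE-style objective. The sketch below is a generic, minimal version of such a loss (not the paper's exact objective, which adds cluster-level and Graph-Cut-enhanced terms); the function name and setup are illustrative assumptions.

```python
import numpy as np

def infonce_loss(z1, z2, tau=0.5):
    """Generic InfoNCE-style contrastive loss between two augmented views.

    z1, z2: (n, d) arrays of embeddings for the same n nodes under two
    augmentations. Matching rows are positive pairs; every other row in
    the opposite view serves as a negative.
    """
    # L2-normalize so dot products become cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = (z1 @ z2.T) / tau                     # (n, n) similarity matrix
    # Row-wise log-softmax; the diagonal entries are the positive pairs
    logits = sim - sim.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))
```

Identical views give a near-zero loss, while unrelated views give a loss near `log(n)`, which is the behavior the contrastive pretext task exploits.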
Cite
Text
Li et al. "Graph Contrastive Representation Learning with Input-Aware and Cluster-Aware Regularization." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2023. doi:10.1007/978-3-031-43415-0_39
Markdown
[Li et al. "Graph Contrastive Representation Learning with Input-Aware and Cluster-Aware Regularization." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2023.](https://mlanthology.org/ecmlpkdd/2023/li2023ecmlpkdd-graph/) doi:10.1007/978-3-031-43415-0_39
BibTeX
@inproceedings{li2023ecmlpkdd-graph,
title = {{Graph Contrastive Representation Learning with Input-Aware and Cluster-Aware Regularization}},
author = {Li, Jin and Li, Bingshi and Zhang, Qirong and Chen, Xinlong and Huang, Xinyang and Guo, Longkun and Fu, Yang-Geng},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2023},
pages = {666-682},
doi = {10.1007/978-3-031-43415-0_39},
url = {https://mlanthology.org/ecmlpkdd/2023/li2023ecmlpkdd-graph/}
}