Atomic Recovery Property for Multi-View Subspace-Preserving Recovery
Abstract
Text-Attributed Graphs (TAGs) are vital for modeling entity relationships across various domains. Graph Neural Networks have become a cornerstone for processing graph structures, while the integration of text attributes remains a prominent research direction. The development of Large Language Models (LLMs) provides new opportunities for advancing textual encoding in TAGs. However, LLMs face challenges in specialized domains due to their limited task-specific knowledge, and fine-tuning them for specific tasks demands significant resources. To address these challenges, we propose HiTuner, a novel framework that leverages fine-tuned Pre-trained Language Models (PLMs) with domain expertise as a tuner to enhance the hierarchical LLM contextualized representations for modeling TAGs. Specifically, we first strategically select hierarchical hidden states of the LLM to form a set of diverse and complementary descriptions as input for the sparse projection operator. Concurrently, a hybrid representation learning scheme is developed to amalgamate the broad linguistic comprehension of LLMs with the task-specific insights of fine-tuned PLMs. Finally, HiTuner employs a confidence network to adaptively fuse the semantically augmented representations. Empirical results across benchmark datasets spanning various domains validate the effectiveness of the proposed framework. Our code is available at: https://github.com/ZihanFang11/HiTuner
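The confidence-based fusion step the abstract describes can be sketched as follows. This is a minimal NumPy illustration, not HiTuner's actual implementation: the function name `confidence_fuse` and the softmax normalization of per-view confidence scores are assumptions for the sake of the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def confidence_fuse(views, scores):
    """Fuse per-view representations with softmax-normalized confidence weights.

    views:  list of k arrays of shape (n, d), one per hierarchical representation
    scores: (n, k) array of raw confidence logits, one column per view
    Returns an (n, d) fused representation.
    """
    weights = softmax(scores, axis=1)                    # (n, k)
    stacked = np.stack(views, axis=1)                    # (n, k, d)
    return (weights[:, :, None] * stacked).sum(axis=1)   # (n, d)
```

With uniform confidence logits, the fusion reduces to a simple average of the views; in general, the learned confidence network would produce node-dependent logits so that more reliable views dominate the fused representation.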
Cite
Text:
Wang, Yulong. "Atomic Recovery Property for Multi-View Subspace-Preserving Recovery." International Joint Conference on Artificial Intelligence, 2024. doi:10.24963/ijcai.2024/569
BibTeX:
@inproceedings{wang2024ijcai-atomic,
title = {{Atomic Recovery Property for Multi-View Subspace-Preserving Recovery}},
author = {Wang, Yulong},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2024},
pages = {5144--5152},
doi = {10.24963/ijcai.2024/569},
url = {https://mlanthology.org/ijcai/2024/wang2024ijcai-atomic/}
}