Multi-Attribute Multi-Grained Adaptation of Pre-Trained Language Models for Text Understanding from Bayesian Perspective
Abstract
Current neural networks often employ multi-domain-learning or attribute-injecting mechanisms to incorporate non-independent and identically distributed (non-IID) information for text understanding tasks by capturing individual characteristics and the relationships among samples. However, the extent of the impact of non-IID information, and how these methods affect pre-trained language models (PLMs), remain unclear. This study revisits the assumption that non-IID information enhances PLMs to achieve performance improvements from a Bayesian perspective, which unearths and integrates non-IID and IID features. Furthermore, we propose a multi-attribute multi-grained framework for PLM adaptations (M2A), which combines multi-attribute and multi-grained views to mitigate uncertainty in a lightweight manner. We evaluate M2A on prevalent text-understanding datasets and demonstrate its superior performance, particularly when data are implicitly non-IID and PLMs scale larger.
Cite
Text
Zhang et al. "Multi-Attribute Multi-Grained Adaptation of Pre-Trained Language Models for Text Understanding from Bayesian Perspective." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I24.34791
Markdown
[Zhang et al. "Multi-Attribute Multi-Grained Adaptation of Pre-Trained Language Models for Text Understanding from Bayesian Perspective." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/zhang2025aaai-multi-d/) doi:10.1609/AAAI.V39I24.34791
BibTeX
@inproceedings{zhang2025aaai-multi-d,
title = {{Multi-Attribute Multi-Grained Adaptation of Pre-Trained Language Models for Text Understanding from Bayesian Perspective}},
author = {Zhang, You and Wang, Jin and Yu, Liang-Chih and Xu, Dan and Zhang, Xuejie},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2025},
pages = {25967--25975},
doi = {10.1609/AAAI.V39I24.34791},
url = {https://mlanthology.org/aaai/2025/zhang2025aaai-multi-d/}
}