Magical: Medical Lay Language Generation via Semantic Invariance and Layperson-Tailored Adaptation

Abstract

Medical Lay Language Generation (MLLG) plays a vital role in improving the accessibility of complex scientific content for broader audiences. Recent work on MLLG commonly employs parameter-efficient fine-tuning methods such as Low-Rank Adaptation (LoRA) to fine-tune large language models (LLMs) on paired expert-lay language datasets. However, LoRA struggles with the challenges posed by multi-source heterogeneous MLLG datasets. Specifically, through a series of exploratory experiments, we reveal that standard LoRA fails to meet the requirements for semantic fidelity and diverse lay-style generation in the MLLG task. To address these limitations, we propose Magical, an asymmetric LoRA architecture tailored for MLLG under heterogeneous data scenarios. Magical employs a shared matrix A for abstractive summarization, along with multiple isolated matrices B for diverse lay-style generation. To preserve semantic fidelity during lay language generation, Magical introduces a Semantic Invariance Constraint to mitigate semantic subspace shifts on matrix A. Furthermore, to better adapt to diverse lay-style generation, Magical incorporates the Recommendation-guided Switch, an external interface that prompts the LLM to switch between the different matrices B. Experimental results on three real-world lay language generation datasets demonstrate that Magical consistently outperforms prompt-based methods, vanilla LoRA, and its recent variants, while also reducing trainable parameters by 31.66%. Our code is publicly available at https://github.com/tianlwang/Magical.git.
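The asymmetric LoRA design described above (a single shared down-projection A with multiple isolated up-projections B, one per lay style) can be sketched as a minimal PyTorch module. This is an illustrative assumption of the structure, not the authors' released implementation (see the repository linked above); names such as AsymmetricLoRALinear, num_styles, and style_id are hypothetical.

import torch
import torch.nn as nn

class AsymmetricLoRALinear(nn.Module):
    """Sketch: frozen base linear layer + shared LoRA A + per-style LoRA B."""

    def __init__(self, base_linear: nn.Linear, rank: int, num_styles: int, alpha: float = 16.0):
        super().__init__()
        self.base = base_linear
        self.base.weight.requires_grad_(False)  # keep pretrained weights frozen
        in_f, out_f = base_linear.in_features, base_linear.out_features
        self.scaling = alpha / rank
        # Shared matrix A: common abstractive-summarization subspace.
        self.lora_A = nn.Parameter(torch.randn(rank, in_f) * 0.01)
        # Isolated matrices B: one low-rank up-projection per lay style.
        self.lora_B = nn.ParameterList(
            [nn.Parameter(torch.zeros(out_f, rank)) for _ in range(num_styles)]
        )

    def forward(self, x: torch.Tensor, style_id: int) -> torch.Tensor:
        # style_id would be selected externally, e.g. by a recommendation-guided switch.
        delta = (x @ self.lora_A.T) @ self.lora_B[style_id].T
        return self.base(x) + self.scaling * delta

In this sketch, routing a batch with style_id=0 versus style_id=1 reuses the same A while applying a different B, which is the separation of shared summarization capacity from style-specific adaptation that the abstract describes.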

Cite

Text

Liao et al. "Magical: Medical Lay Language Generation via Semantic Invariance and Layperson-Tailored Adaptation." Advances in Neural Information Processing Systems, 2025.

Markdown

[Liao et al. "Magical: Medical Lay Language Generation via Semantic Invariance and Layperson-Tailored Adaptation." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/liao2025neurips-magical/)

BibTeX

@inproceedings{liao2025neurips-magical,
  title     = {{Magical: Medical Lay Language Generation via Semantic Invariance and Layperson-Tailored Adaptation}},
  author    = {Liao, Weibin and Wang, Tianlong and Zhu, Yinghao and Wang, Yasha and Gao, Junyi and Ma, Liantao},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/liao2025neurips-magical/}
}