Tang, Anke

9 publications

NeurIPS 2025. Continual Model Merging Without Data: Dual Projections for Balancing Stability and Plasticity. Enneng Yang, Anke Tang, Li Shen, Guibing Guo, Xingwei Wang, Xiaochun Cao, Jie Zhang.
NeurIPS 2025. Merging on the Fly Without Retraining: A Sequential Approach to Scalable Continual Model Merging. Anke Tang, Enneng Yang, Li Shen, Yong Luo, Han Hu, Lefei Zhang, Bo Du, Dacheng Tao.
ICLR 2025. Mitigating the Backdoor Effect for Multi-Task Model Merging via Safety-Aware Subspace. Jinluan Yang, Anke Tang, Didi Zhu, Zhengyu Chen, Li Shen, Fei Wu.
NeurIPS 2025. Mix Data or Merge Models? Balancing the Helpfulness, Honesty, and Harmlessness of Large Language Model via Model Merging. Jinluan Yang, Dingnan Jin, Anke Tang, Li Shen, Didi Zhu, Zhengyu Chen, Ziyu Zhao, Daixin Wang, Qing Cui, Zhiqiang Zhang, Jun Zhou, Fei Wu, Kun Kuang.
ICML 2025. Modeling Multi-Task Model Merging as Adaptive Projective Gradient Descent. Yongxian Wei, Anke Tang, Li Shen, Zixuan Hu, Chun Yuan, Xiaochun Cao.
ICML 2025. Targeted Low-Rank Refinement: Enhancing Sparse Language Models with Precision. Li Shen, Anke Tang, Yong Luo, Tao Sun, Han Hu, Xiaochun Cao.
ICML 2024. Merging Multi-Task Models via Weight-Ensembling Mixture of Experts. Anke Tang, Li Shen, Yong Luo, Nan Yin, Lefei Zhang, Dacheng Tao.
ICLR 2024. Parameter-Efficient Multi-Task Model Fusion with Partial Linearization. Anke Tang, Li Shen, Yong Luo, Yibing Zhan, Han Hu, Bo Du, Yixin Chen, Dacheng Tao.
IJCAI 2023. Improving Heterogeneous Model Reuse by Density Estimation. Anke Tang, Yong Luo, Han Hu, Fengxiang He, Kehua Su, Bo Du, Yixin Chen, Dacheng Tao.