Yun, Sukwon

10 publications

NeurIPS 2025 $\texttt{BetaConform}$: Efficient MAP Estimation of LLM Ensemble Judgment Performance with Prior Transfer. Huaizhi Qu, Inyoung Choi, Zhen Tan, Song Wang, Sukwon Yun, Qi Long, Faizan Siddiqui, Kwonjoon Lee, Tianlong Chen
ICML 2025 $\texttt{I}^2\texttt{MoE}$: Interpretable Multimodal Interaction-Aware Mixture-of-Experts. Jiayi Xin, Sukwon Yun, Jie Peng, Inyoung Choi, Jenna L. Ballard, Tianlong Chen, Qi Long
ICLR 2025 Cut the Crap: An Economical Communication Pipeline for LLM-Based Multi-Agent Systems. Guibin Zhang, Yanwei Yue, Zhixun Li, Sukwon Yun, Guancheng Wan, Kun Wang, Dawei Cheng, Jeffrey Xu Yu, Tianlong Chen
ICML 2025 Modalities Contribute Unequally: Enhancing Medical Multi-Modal Learning Through Adaptive Modality Token Re-Balancing. Jie Peng, Jenna L. Ballard, Mohan Zhang, Sukwon Yun, Jiayi Xin, Qi Long, Yanyong Zhang, Tianlong Chen
ICLR 2025 PortLLM: Personalizing Evolving Large Language Models with Training-Free and Portable Model Patches. Rana Shahroz, Pingzhi Li, Sukwon Yun, Zhenyu Wang, Shahriar Nirjon, Chau-Wai Wong, Tianlong Chen
CPAL 2025 Sparse MoE as a New Treatment: Addressing Forgetting, Fitting, Learning Issues in Multi-Modal Multi-Task Learning. Jie Peng, Sukwon Yun, Kaixiong Zhou, Ruida Zhou, Thomas Hartvigsen, Yanyong Zhang, Zhangyang Wang, Tianlong Chen
ICLR 2025 Subgraph Federated Learning for Local Generalization. Sungwon Kim, Yoonho Lee, Yunhak Oh, Namkyeong Lee, Sukwon Yun, Junseok Lee, Sein Kim, Carl Yang, Chanyoung Park
NeurIPS 2025 Training Robust Graph Neural Networks by Modeling Noise Dependencies. Yeonjun In, Kanghoon Yoon, Sukwon Yun, Kibum Kim, Sungchul Kim, Chanyoung Park
NeurIPS 2024 Flex-MoE: Modeling Arbitrary Modality Combination via the Flexible Mixture-of-Experts. Sukwon Yun, Inyoung Choi, Jie Peng, Yangfan Wu, Jingxuan Bao, Qiyiwen Zhang, Jiayi Xin, Qi Long, Tianlong Chen
ECCV 2024 Mew: Multiplexed Immunofluorescence Image Analysis Through an Efficient Multiplex Network. Sukwon Yun, Jie Peng, Alexandro E. Trevino, Chanyoung Park, Tianlong Chen