Cheng, Zebang

5 publications

ICML 2025. AffectGPT: A New Dataset, Model, and Benchmark for Emotion Understanding with Multimodal Large Language Models. Zheng Lian, Haoyu Chen, Lan Chen, Haiyang Sun, Licai Sun, Yong Ren, Zebang Cheng, Bin Liu, Rui Liu, Xiaojiang Peng, Jiangyan Yi, Jianhua Tao.
AAAI 2025. DREAM: Decoupled Discriminative Learning with Bigraph-Aware Alignment for Semi-Supervised 2D-3D Cross-Modal Retrieval. Fan Zhang, Changhu Wang, Zebang Cheng, Xiaojiang Peng, Dongjie Wang, Yijia Xiao, Chong Chen, Xian-Sheng Hua, Xiao Luo.
CVPRW 2025. Why We Feel: Breaking Boundaries in Emotional Reasoning with Multimodal Large Language Models. Yuxiang Lin, Jingdong Sun, Zhi-Qi Cheng, Jue Wang, Haomin Liang, Zebang Cheng, Yifei Dong, Jun-Yan He, Xiaojiang Peng, Xian-Sheng Hua.
ECCV 2024. Dataset Growth. Ziheng Qin, Zhaopan Xu, YuKun Zhou, Kai Wang, Zangwei Zheng, Zebang Cheng, Hao Tang, Lei Shang, Baigui Sun, Radu Timofte, Xiaojiang Peng, Hongxun Yao, Yang You.
NeurIPS 2024. Emotion-LLaMA: Multimodal Emotion Recognition and Reasoning with Instruction Tuning. Zebang Cheng, Zhi-Qi Cheng, Jun-Yan He, Jingdong Sun, Kai Wang, Yuxiang Lin, Zheng Lian, Xiaojiang Peng, Alexander G. Hauptmann.