Dai, Xiangxiang

2 publications

ICLR 2025 — "Demystifying Online Clustering of Bandits: Enhanced Exploration Under Stochastic and Smoothed Adversarial Contexts" by Zhuohua Li, Maoli Liu, Xiangxiang Dai, John C.S. Lui
ICML 2025 — "Offline Learning for Combinatorial Multi-Armed Bandits" by Xutong Liu, Xiangxiang Dai, Jinhang Zuo, Siwei Wang, Carlee Joe-Wong, John C.S. Lui, Wei Chen