Ye, Linfeng

5 publications

TMLR 2025: "Distributed Quasi-Newton Method for Fair and Fast Federated Learning", Shayan Mohajer Hamidi, Linfeng Ye
TMLR 2025: "Towards Undistillable Models by Minimizing Conditional Mutual Information", Linfeng Ye, Shayan Mohajer Hamidi, En-Hui Yang
ICLR 2024: "Bayes Conditional Distribution Estimation for Knowledge Distillation Based on Conditional Mutual Information", Linfeng Ye, Shayan Mohajer Hamidi, Renhao Tan, En-Hui Yang
ECCV 2024: "How to Train the Teacher Model for Effective Knowledge Distillation", Shayan Mohajer Hamidi, Xizhen Deng, Renhao Tan, Linfeng Ye, Ahmed Hussein Salamah
ECCV 2024: "Markov Knowledge Distillation: Make Nasty Teachers Trained by Self-Undermining Knowledge Distillation Fully Distillable", En-Hui Yang, Linfeng Ye