ML Anthology
Shao, Shitong
12 publications

CVPR 2025
DELT: A Simple Diversity-Driven EarlyLate Training for Dataset Distillation
Zhiqiang Shen, Ammar Sherif, Zeyuan Yin, Shitong Shao

ICCV 2025
Golden Noise for Diffusion Models: A Learning Framework
Zikai Zhou, Shitong Shao, Lichen Bai, Shufei Zhang, Zhiqiang Xu, Bo Han, Zeke Xie

ICLR 2025
IV-Mixed Sampler: Leveraging Image Diffusion Models for Enhanced Video Synthesis
Shitong Shao, Zikai Zhou, Lichen Bai, Haoyi Xiong, Zeke Xie

ICLR 2025
Zigzag Diffusion Sampling: Diffusion Models Can Self-Improve via Self-Reflection
Lichen Bai, Shitong Shao, Zikai Zhou, Zipeng Qi, Zhiqiang Xu, Haoyi Xiong, Zeke Xie

ECCV 2024
Auto-DAS: Automated Proxy Discovery for Training-Free Distillation-Aware Architecture Search
Haosen Sun, Lujun Li, Peijie Dong, Zimian Wei, Shitong Shao

NeurIPS 2024
Diffusion Models Are Certifiably Robust Classifiers
Huanran Chen, Yinpeng Dong, Shitong Shao, Zhongkai Hao, Xiao Yang, Hang Su, Jun Zhu

NeurIPS 2024
Elucidating the Design Space of Dataset Condensation
Shitong Shao, Zikai Zhou, Huanran Chen, Zhiqiang Shen

CVPR 2024
Generalized Large-Scale Data Condensation via Various Backbone and Statistical Matching
Shitong Shao, Zeyuan Yin, Muxin Zhou, Xindong Zhang, Zhiqiang Shen

IJCAI 2024
Rethinking Centered Kernel Alignment in Knowledge Distillation
Zikai Zhou, Yunhang Shen, Shitong Shao, Linrui Gong, Shaohui Lin

IJCAI 2023
Teaching What You Should Teach: A Data-Based Distillation Method
Shitong Shao, Huanran Chen, Zhen Huang, Linrui Gong, Shuai Wang, Xinxiao Wu

ACML 2022
AIIR-MIX: Multi-Agent Reinforcement Learning Meets Attention Individual Intrinsic Reward Mixing Network
Wei Li, Weiyan Liu, Shitong Shao, Shiyi Huang

ECCVW 2022
Bootstrap Generalization Ability from Loss Landscape Perspective
Huanran Chen, Shitong Shao, Ziyi Wang, Zirui Shang, Jin Chen, Xiaofeng Ji, Xinxiao Wu