Fu, Daniel Y.

13 publications

NeurIPS 2025. Exploring Diffusion Transformer Designs via Grafting. Keshigeyan Chandrasegaran, Michael Poli, Daniel Y. Fu, Dongjun Kim, Lea M. Hadzic, Manling Li, Agrim Gupta, Stefano Massaroli, Azalia Mirhoseini, Juan Carlos Niebles, Stefano Ermon, Li Fei-Fei.
CVPR 2025. HMAR: Efficient Hierarchical Masked Auto-Regressive Image Generation. Hermann Kumbong, Xian Liu, Tsung-Yi Lin, Ming-Yu Liu, Xihui Liu, Ziwei Liu, Daniel Y. Fu, Christopher Ré, David W. Romero.
ICLR 2025. ThunderKittens: Simple, Fast, and Adorable Kernels. Benjamin Frederick Spector, Simran Arora, Aaryan Singhal, Arjun Parthasarathy, Daniel Y. Fu, Christopher Ré.
ICML 2024. Benchmarking and Building Long-Context Retrieval Models with LoCo and M2-BERT. Jon Saad-Falcon, Daniel Y. Fu, Simran Arora, Neel Guha, Christopher Ré.
ICLR 2024. FlashFFTConv: Efficient Convolutions for Long Sequences with Tensor Cores. Daniel Y. Fu, Hermann Kumbong, Eric Nguyen, Christopher Ré.
ICMLW 2024. Hydragen: High-Throughput LLM Inference with Shared Prefixes. Jordan Juravsky, Bradley Brown, Ryan Saul Ehrlich, Daniel Y. Fu, Christopher Ré, Azalia Mirhoseini.
NeurIPS 2024. RedPajama: An Open Dataset for Training Large Language Models. Maurice Weber, Daniel Y. Fu, Quentin Anthony, Yonatan Oren, Shane Adams, Anton Alexandrov, Xiaozhong Lyu, Huu Nguyen, Xiaozhe Yao, Virginia Adams, Ben Athiwaratkun, Rahul Chalamala, Kezhen Chen, Max Ryabinin, Tri Dao, Percy Liang, Christopher Ré, Irina Rish, Ce Zhang.
ICLR 2023. Hungry Hungry Hippos: Towards Language Modeling with State Space Models. Daniel Y. Fu, Tri Dao, Khaled Kamal Saab, Armin W. Thomas, Atri Rudra, Christopher Ré.
ICML 2023. Hyena Hierarchy: Towards Larger Convolutional Language Models. Michael Poli, Stefano Massaroli, Eric Nguyen, Daniel Y. Fu, Tri Dao, Stephen Baccus, Yoshua Bengio, Stefano Ermon, Christopher Ré.
ICML 2023. Simple Hardware-Efficient Long Convolutions for Sequence Modeling. Daniel Y. Fu, Elliot L. Epstein, Eric Nguyen, Armin W. Thomas, Michael Zhang, Tri Dao, Atri Rudra, Christopher Ré.
ICLRW 2023. Simple Hardware-Efficient Long Convolutions for Sequence Modeling. Daniel Y. Fu, Elliot L. Epstein, Eric Nguyen, Armin W. Thomas, Michael Zhang, Tri Dao, Atri Rudra, Christopher Ré.
ICML 2022. Perfectly Balanced: Improving Transfer and Robustness of Supervised Contrastive Learning. Mayee Chen, Daniel Y. Fu, Avanika Narayan, Michael Zhang, Zhao Song, Kayvon Fatahalian, Christopher Ré.
UAI 2022. Shoring up the Foundations: Fusing Model Embeddings and Weak Supervision. Mayee F. Chen, Daniel Y. Fu, Dyah Adila, Michael Zhang, Frederic Sala, Kayvon Fatahalian, Christopher Ré.