Yang, Greg

20 publications

ICML 2025 Global Convergence and Rich Feature Learning in $L$-Layer Infinite-Width Neural Networks Under $\mu$P Parametrization Zixiang Chen, Greg Yang, Qingyue Zhao, Quanquan Gu
ICLR 2024 Tensor Programs VI: Feature Learning in Infinite-Depth Neural Networks Greg Yang, Dingli Yu, Chen Zhu, Soufiane Hayou
ICLR 2023 Adaptive Optimization in the $\infty$-Width Limit Etai Littwin, Greg Yang
NeurIPSW 2023 Feature Learning in Infinite-Depth Neural Networks Greg Yang, Dingli Yu, Chen Zhu, Soufiane Hayou
ICML 2023 Width and Depth Limits Commute in Residual Networks Soufiane Hayou, Greg Yang
NeurIPS 2022 3DB: A Framework for Debugging Computer Vision Models Guillaume Leclerc, Hadi Salman, Andrew Ilyas, Sai Vemprala, Logan Engstrom, Vibhav Vineet, Kai Xiao, Pengchuan Zhang, Shibani Santurkar, Greg Yang, Ashish Kapoor, Aleksander Madry
ICLR 2022 Efficient Computation of Deep Nonlinear Infinite-Width Neural Networks That Learn Features Greg Yang, Michael Santacroce, Edward J. Hu
NeurIPS 2022 High-Dimensional Asymptotics of Feature Learning: How One Gradient Step Improves the Representation Jimmy Ba, Murat A Erdogdu, Taiji Suzuki, Zhichao Wang, Denny Wu, Greg Yang
NeurIPS 2022 Non-Gaussian Tensor Programs Eugene Golikov, Greg Yang
ICML 2021 Tensor Programs IIb: Architectural Universality of Neural Tangent Kernel Training Dynamics Greg Yang, Etai Littwin
ICML 2021 Tensor Programs IV: Feature Learning in Infinite-Width Neural Networks Greg Yang, Edward J. Hu
NeurIPS 2020 Denoised Smoothing: A Provable Defense for Pretrained Classifiers Hadi Salman, Mingjie Sun, Greg Yang, Ashish Kapoor, J. Zico Kolter
NeurIPS 2020 On Infinite-Width Hypernetworks Etai Littwin, Tomer Galanti, Lior Wolf, Greg Yang
ICML 2020 Randomized Smoothing of All Shapes and Sizes Greg Yang, Tony Duan, J. Edward Hu, Hadi Salman, Ilya Razenshteyn, Jerry Li
NeurIPS 2019 A Convex Relaxation Barrier to Tight Robustness Verification of Neural Networks Hadi Salman, Greg Yang, Huan Zhang, Cho-Jui Hsieh, Pengchuan Zhang
ICLR 2019 A Mean Field Theory of Batch Normalization Greg Yang, Jeffrey Pennington, Vinay Rao, Jascha Sohl-Dickstein, Samuel S. Schoenholz
ICLR 2019 Bayesian Deep Convolutional Networks with Many Channels Are Gaussian Processes Roman Novak, Lechao Xiao, Yasaman Bahri, Jaehoon Lee, Greg Yang, Jiri Hron, Daniel A. Abolafia, Jeffrey Pennington, Jascha Sohl-Dickstein
NeurIPS 2019 Provably Robust Deep Learning via Adversarially Trained Smoothed Classifiers Hadi Salman, Jerry Li, Ilya Razenshteyn, Pengchuan Zhang, Huan Zhang, Sebastien Bubeck, Greg Yang
NeurIPS 2019 Wide Feedforward or Recurrent Neural Networks of Any Architecture Are Gaussian Processes Greg Yang
ICLR 2017 Lie-Access Neural Turing Machines Greg Yang, Alexander M. Rush