Akiyama, Shunta

9 publications

NeurIPS 2025. Block Coordinate Descent for Neural Networks Provably Finds Global Minima. Shunta Akiyama.
ICML 2025. Survival Analysis via Density Estimation. Hiroki Yanagisawa, Shunta Akiyama.
ICML 2024. SILVER: Single-Loop Variance Reduction and Application to Federated Learning. Kazusato Oko, Shunta Akiyama, Denny Wu, Tomoya Murata, Taiji Suzuki.
ICML 2023. Diffusion Models Are Minimax Optimal Distribution Estimators. Kazusato Oko, Shunta Akiyama, Taiji Suzuki.
ICLRW 2023. Diffusion Models Are Minimax Optimal Distribution Estimators. Kazusato Oko, Shunta Akiyama, Taiji Suzuki.
ICLR 2023. Excess Risk of Two-Layer ReLU Neural Networks in Teacher-Student Settings and Its Superiority to Kernel Methods. Shunta Akiyama, Taiji Suzuki.
NeurIPSW 2022. Reducing Communication in Nonconvex Federated Learning with a Novel Single-Loop Variance Reduction Method. Kazusato Oko, Shunta Akiyama, Tomoya Murata, Taiji Suzuki.
ICLR 2021. Benefit of Deep Learning with Non-Convex Noisy Gradient Descent: Provable Excess Risk Bound and Superiority to Kernel Methods. Taiji Suzuki, Shunta Akiyama.
ICML 2021. On Learnability via Gradient Method for Two-Layer ReLU Neural Networks in Teacher-Student Setting. Shunta Akiyama, Taiji Suzuki.