Oko, Kazusato

20 publications

ICLR 2025. Direct Distributional Optimization for Provable Alignment of Diffusion Models. Ryotaro Kawata, Kazusato Oko, Atsushi Nitanda, Taiji Suzuki.
ICLR 2025. Flow Matching Achieves Almost Minimax Optimal Convergence. Kenji Fukumizu, Taiji Suzuki, Noboru Isobe, Kazusato Oko, Masanori Koyama.
ICML 2025. Nonlinear Transformers Can Perform Inference-Time Feature Learning. Naoki Nishikawa, Yujin Song, Kazusato Oko, Denny Wu, Taiji Suzuki.
ICLR 2024. Improved Statistical and Computational Complexity of the Mean-Field Langevin Dynamics Under Structured Data. Atsushi Nitanda, Kazusato Oko, Taiji Suzuki, Denny Wu.
COLT 2024. Learning Sum of Diverse Features: Computational Hardness and Efficient Gradient-Based Training for Ridge Combinations. Kazusato Oko, Yujin Song, Taiji Suzuki, Denny Wu.
ICML 2024. Mean Field Langevin Actor-Critic: Faster Convergence and Global Optimality Beyond Lazy Learning. Kakei Yamamoto, Kazusato Oko, Zhuoran Yang, Taiji Suzuki.
ICMLW 2024. Neural Network Learns Low-Dimensional Polynomials with SGD near the Information-Theoretic Limit. Jason D. Lee, Kazusato Oko, Taiji Suzuki, Denny Wu.
NeurIPS 2024. Neural Network Learns Low-Dimensional Polynomials with SGD near the Information-Theoretic Limit. Jason D. Lee, Kazusato Oko, Taiji Suzuki, Denny Wu.
NeurIPS 2024. Pretrained Transformer Efficiently Learns Low-Dimensional Target Functions In-Context. Kazusato Oko, Yujin Song, Taiji Suzuki, Denny Wu.
ICML 2024. SILVER: Single-Loop Variance Reduction and Application to Federated Learning. Kazusato Oko, Shunta Akiyama, Denny Wu, Tomoya Murata, Taiji Suzuki.
ICLR 2024. Symmetric Mean-Field Langevin Dynamics for Distributional Minimax Problems. Juno Kim, Kakei Yamamoto, Kazusato Oko, Zhuoran Yang, Taiji Suzuki.
ICMLW 2024. Transformer Efficiently Learns Low-Dimensional Target Functions In-Context. Yujin Song, Denny Wu, Kazusato Oko, Taiji Suzuki.
ICML 2023. Diffusion Models Are Minimax Optimal Distribution Estimators. Kazusato Oko, Shunta Akiyama, Taiji Suzuki.
ICLRW 2023. Diffusion Models Are Minimax Optimal Distribution Estimators. Kazusato Oko, Shunta Akiyama, Taiji Suzuki.
NeurIPS 2023. Feature Learning via Mean-Field Langevin Dynamics: Classifying Sparse Parities and Beyond. Taiji Suzuki, Denny Wu, Kazusato Oko, Atsushi Nitanda.
NeurIPSW 2023. How Structured Data Guides Feature Learning: A Case Study of the Parity Problem. Atsushi Nitanda, Kazusato Oko, Taiji Suzuki, Denny Wu.
ICML 2023. Primal and Dual Analysis of Entropic Fictitious Play for Finite-Sum Problems. Atsushi Nitanda, Kazusato Oko, Denny Wu, Nobuhito Takenouchi, Taiji Suzuki.
NeurIPSW 2023. Symmetric Mean-Field Langevin Dynamics for Distributional Minimax Problems. Juno Kim, Kakei Yamamoto, Kazusato Oko, Zhuoran Yang, Taiji Suzuki.
ICLR 2022. Particle Stochastic Dual Coordinate Ascent: Exponential Convergent Algorithm for Mean Field Neural Network Optimization. Kazusato Oko, Taiji Suzuki, Atsushi Nitanda, Denny Wu.
NeurIPSW 2022. Reducing Communication in Nonconvex Federated Learning with a Novel Single-Loop Variance Reduction Method. Kazusato Oko, Shunta Akiyama, Tomoya Murata, Taiji Suzuki.