Nitanda, Atsushi
38 publications
ICML 2025: Propagation of Chaos for Mean-Field Langevin Dynamics and Its Application to Model Ensemble
NeurIPS 2025: Statistical Analysis of the Sinkhorn Iterations for Two-Sample Schrödinger Bridge Estimation
NeurIPS 2024: Provably Transformers Harness Multi-Concept Word Semantics for Efficient In-Context Learning
NeurIPS 2023: Convergence of Mean-Field Langevin Dynamics: Time-Space Discretization, Stochastic Gradient, and Variance Reduction
NeurIPS 2023: Feature Learning via Mean-Field Langevin Dynamics: Classifying Sparse Parities and Beyond
NeurIPS 2022: Two-Layer Neural Network on Infinite Dimensional Data: Global Optimization Guarantee in the Mean-Field Regime
AISTATS 2021: Exponential Convergence Rates of Classification Errors on Learning with SGD and Random Features
NeurIPS 2021: Deep Learning Is Adaptive to Intrinsic Dimensionality of Model Smoothness in Anisotropic Besov Space
NeurIPS 2021: Generalization Bounds for Graph Embedding Using Negative Sampling: Linear vs Hyperbolic
NeurIPS 2021: Particle Dual Averaging: Optimization of Mean Field Neural Network with Global Convergence Rate Analysis
AISTATS 2020: Functional Gradient Boosting for Learning Residual-like Networks with Statistical Guarantees
AISTATS 2019: Stochastic Gradient Descent with Exponential Convergence Rates of Expected Classification Errors
AISTATS 2018: Gradient Layer: Enhancing the Convergence of Adversarial Training for Generative Models