Taylor, Adrien

11 publications

ICML 2025. The Surprising Agreement Between Convex Optimization Theory and Learning-Rate Scheduling for Large Model Training. Fabian Schaipp, Alexander Hägele, Adrien Taylor, Umut Simsekli, Francis Bach.
NeurIPS 2025. Tight Analyses of First-Order Methods with Error Feedback. Daniel Berg Thomsen, Adrien Taylor, Aymeric Dieuleveut.
ICLR 2024. Leveraging Augmented-Lagrangian Techniques for Differentiating over Infeasible Quadratic Programs in Machine Learning. Antoine Bambade, Fabian Schramm, Adrien Taylor, Justin Carpentier.
ICML 2023. Convergence of Proximal Point and Extragradient-Based Methods Beyond Monotonicity: The Case of Negative Comonotonicity. Eduard Gorbunov, Adrien Taylor, Samuel Horváth, Gauthier Gidel.
NeurIPS 2022. Fast Stochastic Composite Minimization and an Accelerated Frank-Wolfe Algorithm Under Parallelization. Benjamin Dubois-Taine, Francis R. Bach, Quentin Berthet, Adrien Taylor.
NeurIPS 2022. Last-Iterate Convergence of Optimistic Gradient Method for Monotone Variational Inequalities. Eduard Gorbunov, Adrien Taylor, Gauthier Gidel.
NeurIPSW 2022. Quadratic Minimization: From Conjugate Gradients to an Adaptive Heavy-Ball Method with Polyak Step-Sizes. Baptiste Goujaud, Adrien Taylor, Aymeric Dieuleveut.
NeurIPS 2021. Continuized Accelerations of Deterministic and Stochastic Gradient Descents, and of Gossip Algorithms. Mathieu Even, Raphaël Berthier, Francis R. Bach, Nicolas Flammarion, Hadrien Hendrikx, Pierre Gaillard, Laurent Massoulié, Adrien Taylor.
COLT 2020. Complexity Guarantees for Polyak Steps with Momentum. Mathieu Barré, Adrien Taylor, Alexandre d'Aspremont.
COLT 2019. Stochastic First-Order Methods: Non-Asymptotic and Computer-Aided Analyses via Potential Functions. Adrien Taylor, Francis Bach.
ICML 2018. Lyapunov Functions for First-Order Methods: Tight Automated Convergence Guarantees. Adrien Taylor, Bryan Van Scoy, Laurent Lessard.