Derezinski, Michal

24 publications

JMLR 2025 Fine-Grained Analysis and Faster Algorithms for Iteratively Solving Linear Systems Michal Derezinski, Daniel LeJeune, Deanna Needell, Elizaveta Rebrova
TMLR 2025 Stochastic Variance-Reduced Newton: Accelerating Finite-Sum Minimization with Large Batches Michal Derezinski
NeurIPS 2025 Turbocharging Gaussian Process Inference with Approximate Sketch-and-Project Pratik Rathore, Zachary Frangella, Sachin Garg, Shaghayegh Fazliani, Michal Derezinski, Madeleine Udell
NeurIPSW 2024 HERTA: A High-Efficiency and Rigorous Training Algorithm for Unfolded Graph Neural Networks Yongyi Yang, Jiaming Yang, Wei Hu, Michal Derezinski
NeurIPSW 2023 Stochastic Variance-Reduced Newton: Accelerating Finite-Sum Minimization with Large Batches Michal Derezinski
IJCAI 2021 Improved Guarantees and a Multiple-Descent Curve for Column Subset Selection and the Nystrom Method (Extended Abstract) Michal Derezinski, Rajiv Khanna, Michael W. Mahoney
NeurIPS 2021 Newton-LESS: Sparsification Without Trade-Offs for the Sketched Newton Update Michal Derezinski, Jonathan Lacotte, Mert Pilanci, Michael W. Mahoney
COLT 2021 Query Complexity of Least Absolute Deviation Regression via Robust Uniform Convergence Xue Chen, Michal Derezinski
COLT 2021 Sparse Sketches with Small Inversion Bias Michal Derezinski, Zhenyu Liao, Edgar Dobriban, Michael Mahoney
AISTATS 2020 Bayesian Experimental Design Using Regularized Determinantal Point Processes Michal Derezinski, Feynman Liang, Michael Mahoney
AISTATS 2020 Convergence Analysis of Block Coordinate Algorithms with Determinantal Sampling Mojmir Mutny, Michal Derezinski, Andreas Krause
NeurIPS 2020 Debiasing Distributed Second Order Optimization with Surrogate Sketching and Scaled Regularization Michal Derezinski, Burak Bartan, Mert Pilanci, Michael W. Mahoney
NeurIPS 2020 Exact Expressions for Double Descent and Implicit Regularization via Surrogate Random Design Michal Derezinski, Feynman T Liang, Michael W. Mahoney
NeurIPS 2020 Improved Guarantees and a Multiple-Descent Curve for Column Subset Selection and the Nystrom Method Michal Derezinski, Rajiv Khanna, Michael W. Mahoney
NeurIPS 2020 Precise Expressions for Random Projections: Low-Rank Approximation and Randomized Newton Michal Derezinski, Feynman T Liang, Zhenyu Liao, Michael W. Mahoney
NeurIPS 2020 Sampling from a K-DPP Without Looking at All Items Daniele Calandriello, Michal Derezinski, Michal Valko
AISTATS 2019 Correcting the Bias in Least Squares Regression with Volume-Rescaled Sampling Michal Derezinski, Manfred K. Warmuth, Daniel Hsu
NeurIPS 2019 Distributed Estimation of the Inverse Hessian by Determinantal Averaging Michal Derezinski, Michael W. Mahoney
NeurIPS 2019 Exact Sampling of Determinantal Point Processes with Sublinear Time Preprocessing Michal Derezinski, Daniele Calandriello, Michal Valko
AISTATS 2018 Batch-Expansion Training: An Efficient Optimization Framework Michal Derezinski, Dhruv Mahajan, S. Sathiya Keerthi, S. V. N. Vishwanathan, Markus Weimer
NeurIPS 2018 Leveraged Volume Sampling for Linear Regression Michal Derezinski, Manfred K. Warmuth, Daniel J. Hsu
AISTATS 2018 Subsampling for Ridge Regression via Regularized Volume Sampling Michal Derezinski, Manfred K. Warmuth
NeurIPS 2017 Unbiased Estimates for Linear Regression via Volume Sampling Michal Derezinski, Manfred K. Warmuth
NeurIPS 2014 The Limits of Squared Euclidean Distance Regularization Michal Derezinski, Manfred K. Warmuth