van Baalen, Mart

13 publications

TMLR 2025. Mixture of Cache-Conditional Experts for Efficient Mobile Device Inference. Andrii Skliar, Ties van Rozendaal, Romain Lepert, Todor Boinovski, Mart van Baalen, Markus Nagel, Paul N. Whatmough, Babak Ehteshami Bejnordi.
ICMLW 2024. GPTVQ: The Blessing of Dimensionality for LLM Quantization. Mart van Baalen, Andrey Kuzmin, Markus Nagel, Peter Couperus, Artem Bolshakov, Cedric Bastoul, Eric Mahurin, Tijmen Blankevoort, Paul Whatmough.
ICMLW 2024. Rapid Switching and Multi-Adapter Fusion via Sparse High Rank Adapters. Kartikeya Bhardwaj, Nilesh Prasad Pandey, Sweta Priyadarshi, Viswanath Ganapathy, Rafael Esteves, Shreya Kadambi, Shubhankar Borse, Paul Whatmough, Risheek Garrepalli, Mart van Baalen, Harris Teague, Markus Nagel.
NeurIPS 2024. Sparse High Rank Adapters. Kartikeya Bhardwaj, Nilesh Prasad Pandey, Sweta Priyadarshi, Viswanath Ganapathy, Shreya Kadambi, Rafael Esteves, Shubhankar Borse, Paul Whatmough, Risheek Garrepalli, Mart van Baalen, Harris Teague, Markus Nagel.
ICLR 2024. The LLM Surgeon. Tycho F. A. van der Ouderaa, Markus Nagel, Mart van Baalen, Tijmen Blankevoort.
NeurIPS 2023. Pruning vs Quantization: Which Is Better? Andrey Kuzmin, Markus Nagel, Mart van Baalen, Arash Behboodi, Tijmen Blankevoort.
ICCVW 2023. QBitOpt: Fast and Accurate Bitwidth Reallocation During Training. Jorn Peters, Marios Fournarakis, Markus Nagel, Mart van Baalen, Tijmen Blankevoort.
CVPRW 2022. Cyclical Pruning for Sparse Neural Networks. Suraj Srinivas, Andrey Kuzmin, Markus Nagel, Mart van Baalen, Andrii Skliar, Tijmen Blankevoort.
NeurIPS 2022. FP8 Quantization: The Power of the Exponent. Andrey Kuzmin, Mart van Baalen, Yuwei Ren, Markus Nagel, Jorn Peters, Tijmen Blankevoort.
CVPRW 2022. Simulated Quantization, Real Power Savings. Mart van Baalen, Brian Kahne, Eric Mahurin, Andrey Kuzmin, Andrii Skliar, Markus Nagel, Tijmen Blankevoort.
NeurIPS 2020. Bayesian Bits: Unifying Quantization and Pruning. Mart van Baalen, Christos Louizos, Markus Nagel, Rana Ali Amjad, Ying Wang, Tijmen Blankevoort, Max Welling.
ICLR 2020. Gradient $\ell_1$ Regularization for Quantization Robustness. Milad Alizadeh, Arash Behboodi, Mart van Baalen, Christos Louizos, Tijmen Blankevoort, Max Welling.
ICML 2020. Up or Down? Adaptive Rounding for Post-Training Quantization. Markus Nagel, Rana Ali Amjad, Mart van Baalen, Christos Louizos, Tijmen Blankevoort.