ML Anthology
Holzmüller, David
12 publications
ICLR 2025
Active Learning for Neural PDE Solvers
Daniel Musekamp, Marimuthu Kalimuthu, David Holzmüller, Makoto Takamoto, Mathias Niepert
JMLR 2025
Convergence Rates for Non-Log-Concave Sampling and Log-Partition Estimation
David Holzmüller, Francis Bach
TMLR 2025
LOGLO-FNO: Efficient Learning of Local and Global Features in Fourier Neural Operators
Marimuthu Kalimuthu, David Holzmüller, Mathias Niepert
ICLRW 2025
LOGLO-FNO: Efficient Learning of Local and Global Features in Fourier Neural Operators
Marimuthu Kalimuthu, David Holzmüller, Mathias Niepert
NeurIPS 2025
TabArena: A Living Benchmark for Machine Learning on Tabular Data
Nick Erickson, Lennart Purucker, Andrej Tschalzev, David Holzmüller, Prateek Mutalik Desai, David Salinas, Frank Hutter
ICML 2025
TabICL: A Tabular Foundation Model for In-Context Learning on Large Data
Jingang Qu, David Holzmüller, Gaël Varoquaux, Marine Le Morvan
NeurIPSW 2024
Active Learning for Neural PDE Solvers
Daniel Musekamp, Marimuthu Kalimuthu, David Holzmüller, Makoto Takamoto, Mathias Niepert
NeurIPS 2024
Better by Default: Strong Pre-Tuned MLPs and Boosted Trees on Tabular Data
David Holzmüller, Léo Grinsztajn, Ingo Steinwart
JMLR 2023
A Framework and Benchmark for Deep Batch Active Learning for Regression
David Holzmüller, Viktor Zaverkin, Johannes Kästner, Ingo Steinwart
NeurIPS 2023
Mind the Spikes: Benign Overfitting of Kernels and Neural Networks in Fixed Dimension
Moritz Haas, David Holzmüller, Ulrike von Luxburg, Ingo Steinwart
JMLR 2022
Training Two-Layer ReLU Networks with Gradient Descent Is Inconsistent
David Holzmüller, Ingo Steinwart
ICLR 2021
On the Universality of the Double Descent Peak in Ridgeless Regression
David Holzmüller