Mollenhauer, Mattes

7 publications

TMLR 2025. Choose Your Model Size: Any Compression of Large Language Models Without Re-Computation. Martin Genzel, Patrick Putzky, Pengfei Zhao, Sebastian Schulze, Mattes Mollenhauer, Robert Seidel, Stefan Dietzel, Thomas Wollmann.

NeurIPS 2025. Regularized Least Squares Learning with Heavy-Tailed Noise Is Minimax Optimal. Mattes Mollenhauer, Nicole Mücke, Dimitri Meunier, Arthur Gretton.

NeurIPS 2024. Optimal Rates for Vector-Valued Spectral Regularization Learning Algorithms. Dimitri Meunier, Zikai Shen, Mattes Mollenhauer, Arthur Gretton, Zhu Li.

JMLR 2024. Towards Optimal Sobolev Norm Rates for the Vector-Valued Regularized Least-Squares Algorithm. Zhu Li, Dimitri Meunier, Mattes Mollenhauer, Arthur Gretton.

JMLR 2022. Kernel Autocovariance Operators of Stationary Processes: Estimation and Convergence. Mattes Mollenhauer, Stefan Klus, Christof Schütte, Péter Koltai.

NeurIPS 2022. Optimal Rates for Regularized Conditional Mean Embedding Learning. Zhu Li, Dimitri Meunier, Mattes Mollenhauer, Arthur Gretton.

AISTATS 2020. Kernel Conditional Density Operators. Ingmar Schuster, Mattes Mollenhauer, Stefan Klus, Krikamol Muandet.