A Two-Stage Learning-to-Defer Approach for Multi-Task Learning
Abstract
The Two-Stage Learning-to-Defer (L2D) framework has been extensively studied for classification and, more recently, regression tasks. However, many real-world applications require solving both tasks jointly in a multi-task setting. We introduce a novel Two-Stage L2D framework for multi-task learning that integrates classification and regression through a unified deferral mechanism. Our method leverages a two-stage surrogate loss family, which we prove to be both Bayes-consistent and $(\mathcal{G}, \mathcal{R})$-consistent, ensuring convergence to the Bayes-optimal rejector. We derive explicit consistency bounds tied to the cross-entropy surrogate and the $L_1$-norm of agent-specific costs, and extend minimizability gap analysis to the multi-expert two-stage regime. We also make explicit how shared representation learning—commonly used in multi-task models—affects these consistency guarantees. Experiments on object detection and electronic health record analysis demonstrate the effectiveness of our approach and highlight the limitations of existing L2D methods in multi-task scenarios.
Cite
Text
Montreuil et al. "A Two-Stage Learning-to-Defer Approach for Multi-Task Learning." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Montreuil et al. "A Two-Stage Learning-to-Defer Approach for Multi-Task Learning." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/montreuil2025icml-twostage/)
BibTeX
@inproceedings{montreuil2025icml-twostage,
title = {{A Two-Stage Learning-to-Defer Approach for Multi-Task Learning}},
author = {Montreuil, Yannis and Heng, Yeo Shu and Carlier, Axel and Ng, Lai Xing and Ooi, Wei Tsang},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {44726--44749},
volume = {267},
url = {https://mlanthology.org/icml/2025/montreuil2025icml-twostage/}
}