End-to-End Conformal Calibration for Optimization Under Uncertainty
Abstract
Machine learning can significantly improve performance for decision-making under uncertainty across a wide range of domains. However, ensuring robustness guarantees requires well-calibrated uncertainty estimates, which can be difficult to achieve with neural networks. Moreover, in high-dimensional settings, there may be many valid uncertainty estimates, each with its own performance profile—i.e., not all uncertainty is equally valuable for downstream decision-making. To address this problem, this paper develops an end-to-end framework to _learn_ uncertainty sets for conditional robust optimization in a way that is informed by the downstream decision-making loss, with robustness and calibration guarantees provided by conformal prediction. In addition, we propose to represent general convex uncertainty sets with partially input-convex neural networks, which are learned as part of our framework. Our approach consistently improves upon two-stage estimate-then-optimize baselines on concrete applications in energy storage arbitrage and portfolio optimization.
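To make the two ingredients named in the abstract concrete, here is a minimal sketch, assuming PyTorch and hypothetical helper names (`PICNN`, `conformal_quantile`) that are not taken from the paper: a partially input-convex network f(x, y) that is convex in the uncertain quantity y, and split conformal calibration of a threshold q so that the uncertainty set Ω(x) = {y : f(x, y) ≤ q} attains the target coverage. Layer sizes, the softplus weight parameterization, and the data shapes are illustrative assumptions, not the authors' released implementation.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class PICNN(nn.Module):
    """Score f(x, y) that is convex in y: the hidden z-path uses nonnegative weights
    (softplus-parameterized) composed with convex, nondecreasing ReLU activations."""

    def __init__(self, x_dim: int, y_dim: int, hidden: int = 64, depth: int = 2):
        super().__init__()
        # x-path and y-path enter each layer affinely; convexity in x is not required.
        self.x_layers = nn.ModuleList(
            [nn.Linear(x_dim, hidden) for _ in range(depth)] + [nn.Linear(x_dim, 1)]
        )
        self.y_layers = nn.ModuleList(
            [nn.Linear(y_dim, hidden, bias=False) for _ in range(depth)]
            + [nn.Linear(y_dim, 1, bias=False)]
        )
        # Unconstrained parameters for the z-path; mapped through softplus in forward()
        # so the effective weights are nonnegative, which preserves convexity in y.
        self.z_weights = nn.ParameterList(
            [nn.Parameter(0.1 * torch.randn(hidden, hidden)) for _ in range(depth - 1)]
            + [nn.Parameter(0.1 * torch.randn(1, hidden))]
        )

    def forward(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        z = F.relu(self.y_layers[0](y) + self.x_layers[0](x))
        for i, w in enumerate(self.z_weights[:-1]):
            z = F.relu(
                F.linear(z, F.softplus(w)) + self.y_layers[i + 1](y) + self.x_layers[i + 1](x)
            )
        out = F.linear(z, F.softplus(self.z_weights[-1])) + self.y_layers[-1](y) + self.x_layers[-1](x)
        return out.squeeze(-1)


@torch.no_grad()
def conformal_quantile(model: PICNN, x_cal: torch.Tensor, y_cal: torch.Tensor,
                       alpha: float = 0.1) -> float:
    """Split conformal calibration: with n exchangeable calibration pairs, taking q as the
    ceil((n + 1) * (1 - alpha))-th smallest score gives P(f(x, y) <= q) >= 1 - alpha for a
    fresh test pair, so Omega(x) = {y : f(x, y) <= q} covers y with probability >= 1 - alpha."""
    scores = model(x_cal, y_cal)
    n = scores.numel()
    k = math.ceil((n + 1) * (1 - alpha))
    assert k <= n, "calibration set too small for the requested coverage level"
    return scores.sort().values[k - 1].item()


if __name__ == "__main__":
    torch.manual_seed(0)
    model = PICNN(x_dim=5, y_dim=3)  # e.g., x = context features, y = uncertain prices
    x_cal, y_cal = torch.randn(200, 5), torch.randn(200, 3)
    q = conformal_quantile(model, x_cal, y_cal, alpha=0.1)
    # The calibrated uncertainty set for a new context x is Omega(x) = {y : model(x, y) <= q},
    # convex in y, and can be passed to a downstream robust optimization layer.
    print(f"conformal threshold q = {q:.3f}")
```

In the end-to-end framework, the score network would additionally be trained through the downstream decision loss; the sketch above only shows the set representation and the conformal step.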
Cite
Text
Yeh et al. "End-to-End Conformal Calibration for Optimization Under Uncertainty." Transactions on Machine Learning Research, 2025.

Markdown
[Yeh et al. "End-to-End Conformal Calibration for Optimization Under Uncertainty." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/yeh2025tmlr-endtoend/)

BibTeX
@article{yeh2025tmlr-endtoend,
  title = {{End-to-End Conformal Calibration for Optimization Under Uncertainty}},
  author = {Yeh, Christopher and Christianson, Nicolas and Wu, Alan and Wierman, Adam and Yue, Yisong},
  journal = {Transactions on Machine Learning Research},
  year = {2025},
  url = {https://mlanthology.org/tmlr/2025/yeh2025tmlr-endtoend/}
}