Calibrated Reliable Regression Using Maximum Mean Discrepancy

Abstract

Accurate quantification of uncertainty is crucial for real-world applications of machine learning. However, modern deep neural networks still produce unreliable predictive uncertainty, often yielding over-confident predictions. In this paper, we are concerned with obtaining well-calibrated predictions in regression tasks. We propose a calibrated regression method based on the maximum mean discrepancy, which minimizes a kernel embedding measure of the calibration error. Theoretically, the calibration error of our method asymptotically converges to zero as the sample size grows. Experiments on non-trivial real datasets show that our method produces well-calibrated and sharp prediction intervals, outperforming related state-of-the-art methods.
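The maximum mean discrepancy (MMD) mentioned in the abstract is a kernel-based distance between distributions: it is zero when two samples come from the same distribution and grows as they diverge. The sketch below is not the authors' calibration method; it is only a minimal illustration of the biased empirical MMD² estimate with a Gaussian (RBF) kernel, with an assumed bandwidth of 1.0, to show the quantity being minimized.

```python
import numpy as np

def rbf_kernel(x, y, bandwidth=1.0):
    # Gaussian (RBF) kernel matrix between 1-D sample arrays x and y.
    d2 = (x[:, None] - y[None, :]) ** 2
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def mmd2(x, y, bandwidth=1.0):
    # Biased empirical estimate of the squared maximum mean discrepancy:
    # MMD^2(P, Q) = E[k(x, x')] - 2 E[k(x, y)] + E[k(y, y')].
    kxx = rbf_kernel(x, x, bandwidth).mean()
    kyy = rbf_kernel(y, y, bandwidth).mean()
    kxy = rbf_kernel(x, y, bandwidth).mean()
    return kxx - 2.0 * kxy + kyy

rng = np.random.default_rng(0)
# Samples from the same distribution: MMD^2 is near zero.
same = mmd2(rng.normal(0.0, 1.0, 500), rng.normal(0.0, 1.0, 500))
# Samples from shifted distributions: MMD^2 is clearly positive.
shifted = mmd2(rng.normal(0.0, 1.0, 500), rng.normal(2.0, 1.0, 500))
print(same, shifted)
```

Driving such a discrepancy toward zero between predicted and observed distributions is the general idea behind using MMD as a calibration objective; the paper's actual loss and training procedure are defined in the full text.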

Cite

Text

Cui et al. "Calibrated Reliable Regression Using Maximum Mean Discrepancy." Neural Information Processing Systems, 2020.

Markdown

[Cui et al. "Calibrated Reliable Regression Using Maximum Mean Discrepancy." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/cui2020neurips-calibrated/)

BibTeX

@inproceedings{cui2020neurips-calibrated,
  title     = {{Calibrated Reliable Regression Using Maximum Mean Discrepancy}},
  author    = {Cui, Peng and Hu, Wenbo and Zhu, Jun},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/cui2020neurips-calibrated/}
}