Calibrating Deep Ensemble Through Functional Variational Inference
Abstract
Deep Ensemble (DE) is an effective and practical uncertainty quantification approach in deep learning. The uncertainty of DE usually manifests as functional inconsistency among the ensemble members, which, however, originates from unmanageable randomness in the initialization and optimization of neural networks (NNs) and may easily collapse in specific cases. To tackle this issue, we advocate characterizing the functional inconsistency with the empirical covariance of the functions dictated by the ensemble members, and defining a Gaussian process (GP) with it. We perform functional variational inference to tune such a probabilistic model w.r.t. training data and specific prior beliefs. This way, we can explicitly manage the uncertainty of the ensemble of NNs. We further provide strategies to make the training efficient. The proposed approach achieves better uncertainty quantification than DE and its variants across diverse scenarios, while consuming only marginally added training cost compared to standard DE. The code is available at https://github.com/thudzj/DE-GP.
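The abstract's core construction is to treat the ensemble members as samples of a random function and summarize them with an empirical mean and covariance, which then define a GP. A minimal sketch of that summarization step, assuming scalar-output networks evaluated at a shared batch of inputs (the function and variable names here are illustrative, not taken from the paper's codebase):

```python
import numpy as np

def empirical_gp(outputs):
    """Summarize ensemble outputs as an empirical GP.

    outputs: array of shape (K, N) -- K ensemble members evaluated
    at the same N inputs (scalar outputs, a simplification for
    illustration).
    Returns the empirical mean (N,) and covariance (N, N); the
    covariance captures the functional inconsistency among members.
    """
    K = outputs.shape[0]
    mean = outputs.mean(axis=0)          # m(X): average prediction
    centered = outputs - mean            # f_k(X) - m(X) per member
    cov = centered.T @ centered / K      # empirical covariance over members
    return mean, cov

# Toy usage: two "members" evaluated at two inputs.
mean, cov = empirical_gp(np.array([[1.0, 2.0], [3.0, 4.0]]))
```

In the paper's approach this GP is not left as-is: functional variational inference tunes it against the data and a prior, so the covariance (and hence the ensemble's uncertainty) is managed explicitly rather than inherited from initialization randomness.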
Cite

Text

Deng et al. "Calibrating Deep Ensemble Through Functional Variational Inference." Transactions on Machine Learning Research, 2024.

Markdown

[Deng et al. "Calibrating Deep Ensemble Through Functional Variational Inference." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/deng2024tmlr-calibrating/)

BibTeX
@article{deng2024tmlr-calibrating,
title = {{Calibrating Deep Ensemble Through Functional Variational Inference}},
author = {Deng, Zhijie and Zhou, Feng and Chen, Jianfei and Wu, Guoqiang and Zhu, Jun},
journal = {Transactions on Machine Learning Research},
year = {2024},
url = {https://mlanthology.org/tmlr/2024/deng2024tmlr-calibrating/}
}