Calibrating a Deep Neural Network with Its Predecessors
Abstract
Confidence calibration - the process of calibrating the output probability distribution of a neural network - is essential for safety-critical applications of such networks. Recent works verify the link between mis-calibration and overfitting. However, early stopping, a well-known technique to mitigate overfitting, fails to calibrate networks. In this work, we study the limitations of early stopping and comprehensively analyze the overfitting problem of a network, considering each individual block. We then propose a novel regularization method, predecessor combination search (PCS), which improves calibration by searching for a combination of best-fitting block predecessors, where block predecessors are the corresponding network blocks with weight parameters from earlier training stages. PCS achieves state-of-the-art calibration performance on multiple datasets and architectures. In addition, PCS improves model robustness under dataset distribution shift. Supplementary material and code are available at https://github.com/Linwei94/PCS
Cite
Text
Tao et al. "Calibrating a Deep Neural Network with Its Predecessors." International Joint Conference on Artificial Intelligence, 2023. doi:10.24963/IJCAI.2023/475
Markdown
[Tao et al. "Calibrating a Deep Neural Network with Its Predecessors." International Joint Conference on Artificial Intelligence, 2023.](https://mlanthology.org/ijcai/2023/tao2023ijcai-calibrating/) doi:10.24963/IJCAI.2023/475
BibTeX
@inproceedings{tao2023ijcai-calibrating,
title = {{Calibrating a Deep Neural Network with Its Predecessors}},
author = {Tao, Linwei and Dong, Minjing and Liu, Daochang and Sun, Changming and Xu, Chang},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2023},
pages = {4271--4279},
doi = {10.24963/IJCAI.2023/475},
url = {https://mlanthology.org/ijcai/2023/tao2023ijcai-calibrating/}
}