Cost-Aware Pre-Training for Multiclass Cost-Sensitive Deep Learning
Abstract
Deep learning is among the most prominent machine learning techniques today, achieving state-of-the-art results on a broad range of applications where automatic feature extraction is needed. Many such applications also assign varying costs to different types of misclassification errors, but it is not clear whether or how such cost information can be incorporated into deep learning to improve performance. In this work, we first design a novel loss function that embeds the cost information for the training stage of cost-sensitive deep learning. We then show that the loss function can also be integrated into the pre-training stage to conduct cost-aware feature extraction more effectively. Extensive experimental results justify the validity of the novel loss function for making existing deep learning models cost-sensitive, and demonstrate that our proposed model with cost-aware pre-training and training outperforms non-deep models and other deep models that digest the cost information in other stages.
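To make the idea of embedding cost information into a loss function concrete, here is a minimal sketch of one common way to do it: weighting a model's predictive distribution by a per-class cost matrix so that expensive mistakes are penalized more heavily. This is a generic illustration, not necessarily the specific loss formulation proposed in the paper; the function and variable names are our own.

```python
import numpy as np

def expected_cost_loss(probs, labels, cost_matrix):
    """Expected-cost loss for multiclass cost-sensitive learning (generic sketch).

    probs:       (n, K) array of predicted class probabilities per example
    labels:      (n,)   array of true class indices
    cost_matrix: (K, K) array where cost_matrix[y, k] is the cost of
                 predicting class k when the true class is y
                 (diagonal is typically zero: correct predictions cost nothing)
    """
    # For each example, take the cost row of its true label and compute
    # the expectation of cost under the predicted distribution:
    #   sum_k p(k | x) * C[y, k]
    per_example = (probs * cost_matrix[labels]).sum(axis=1)
    return per_example.mean()

# Usage: a 3-class problem where confusing class 0 with class 2 is costly.
C = np.array([[0.0, 1.0, 5.0],
              [1.0, 0.0, 1.0],
              [5.0, 1.0, 0.0]])
probs = np.array([[0.8, 0.1, 0.1],
                  [0.1, 0.8, 0.1]])
labels = np.array([0, 1])
loss = expected_cost_loss(probs, labels, C)  # -> 0.4
```

With a uniform cost matrix (all off-diagonal costs equal), this reduces to an ordinary misclassification-style objective; an asymmetric matrix, as above, steers the model away from the expensive confusions.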
Cite
Chung et al. "Cost-Aware Pre-Training for Multiclass Cost-Sensitive Deep Learning." International Joint Conference on Artificial Intelligence, 2016.
BibTeX
@inproceedings{chung2016ijcai-cost,
title = {{Cost-Aware Pre-Training for Multiclass Cost-Sensitive Deep Learning}},
author = {Chung, Yu-An and Lin, Hsuan-Tien and Yang, Shao-Wen},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2016},
pages = {1411-1417},
url = {https://mlanthology.org/ijcai/2016/chung2016ijcai-cost/}
}