Condensed Filter Tree for Cost-Sensitive Multi-Label Classification
Abstract
Different real-world applications of multi-label classification often demand different evaluation criteria. We formalize this demand with a general setup, cost-sensitive multi-label classification (CSMLC), which takes the evaluation criterion into account during learning. Nevertheless, most existing algorithms can only optimize a few specific evaluation criteria and cannot systematically handle different ones. In this paper, we propose a novel algorithm, called condensed filter tree (CFT), for optimizing any criterion in CSMLC. CFT is derived by reducing CSMLC to the well-known filter tree algorithm for cost-sensitive multi-class classification via constructing the label powerset. We cope with the difficulty of having exponentially many extended classes within the powerset, for representation, training, and prediction, by carefully designing the tree structure and focusing on the key nodes. Experimental results across many real-world datasets validate that CFT is competitive with special-purpose algorithms on their specific criteria and achieves better performance on general criteria.
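The reduction sketched in the abstract starts from the label powerset: each label set (a bit vector) is mapped to one "extended class", so any label-set-level criterion becomes a per-class cost that a cost-sensitive multi-class learner such as a filter tree could then optimize. The minimal sketch below illustrates only this encoding step; the function names and the Hamming-loss example are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of the label-powerset encoding behind the reduction.
# Each k-bit label vector becomes one integer "extended class"; a criterion
# (here, Hamming loss as an example) turns into a per-class cost vector.

def labelset_to_class(y):
    """Encode a binary label vector (tuple of 0/1) as an integer class."""
    c = 0
    for bit in y:
        c = (c << 1) | bit
    return c

def class_to_labelset(c, k):
    """Decode an integer class back to a k-bit label vector."""
    return tuple((c >> (k - 1 - i)) & 1 for i in range(k))

def hamming_cost(y_true, y_pred):
    """Example criterion: Hamming loss between two label vectors."""
    return sum(a != b for a, b in zip(y_true, y_pred)) / len(y_true)

k = 3
y = (1, 0, 1)
c = labelset_to_class(y)  # the extended class for this label set
assert class_to_labelset(c, k) == y
# Cost of predicting extended class cp when the true label set is y:
costs = [hamming_cost(y, class_to_labelset(cp, k)) for cp in range(2 ** k)]
```

Note that there are 2^k extended classes, which is exactly the exponential blow-up that CFT's condensed tree structure is designed to avoid enumerating.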
Cite
Text
Li and Lin. "Condensed Filter Tree for Cost-Sensitive Multi-Label Classification." International Conference on Machine Learning, 2014.
Markdown
[Li and Lin. "Condensed Filter Tree for Cost-Sensitive Multi-Label Classification." International Conference on Machine Learning, 2014.](https://mlanthology.org/icml/2014/li2014icml-condensed/)
BibTeX
@inproceedings{li2014icml-condensed,
title = {{Condensed Filter Tree for Cost-Sensitive Multi-Label Classification}},
author = {Li, Chun-Liang and Lin, Hsuan-Tien},
booktitle = {International Conference on Machine Learning},
year = {2014},
pages = {423--431},
volume = {32},
url = {https://mlanthology.org/icml/2014/li2014icml-condensed/}
}