Consistent Algorithms for Multi-Label Classification with Macro-at-$k$ Metrics

Abstract

We consider the optimization of complex performance metrics in multi-label classification under the population utility framework. We mainly focus on metrics linearly decomposable into a sum of binary classification utilities applied separately to each label, with the additional requirement that exactly $k$ labels be predicted for each instance. These "macro-at-$k$" metrics possess desirable properties for extreme classification problems with long-tail labels. Unfortunately, the at-$k$ constraint couples the otherwise independent binary classification tasks, leading to a much more challenging optimization problem than standard macro-averages. We provide a statistical framework to study this problem, prove the existence and the form of the optimal classifier, and propose a statistically consistent and practical learning algorithm based on the Frank-Wolfe method. Interestingly, our main results concern even more general metrics that are non-linear functions of label-wise confusion matrices. Empirical results provide evidence for the competitive performance of the proposed approach.
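To make the metric family concrete, below is a minimal sketch of how a macro-at-$k$ metric can be evaluated: each instance is assigned exactly $k$ predicted labels (here via top-$k$ scores), a binary utility (F1 is used purely as an example) is computed from each label's confusion-matrix entries, and the utilities are averaged over labels. The function names `topk_predict` and `macro_f1_at_k` are illustrative and not taken from the paper; this is not the paper's Frank-Wolfe learning algorithm, only the metric being optimized.

```python
import numpy as np

def topk_predict(scores, k):
    """Predict exactly k labels per instance by selecting the top-k scores.

    scores: (n_instances, n_labels) array of real-valued label scores.
    Returns a binary (n_instances, n_labels) matrix with exactly k ones per row.
    """
    n, m = scores.shape
    preds = np.zeros((n, m), dtype=int)
    topk = np.argpartition(-scores, kth=k - 1, axis=1)[:, :k]
    preds[np.arange(n)[:, None], topk] = 1
    return preds

def macro_f1_at_k(y_true, scores, k, eps=1e-12):
    """Macro-averaged binary F1 under the exactly-k-predictions constraint.

    The binary utility is computed per label from that label's
    confusion-matrix entries, then averaged over labels (macro average).
    """
    y_pred = topk_predict(scores, k)
    tp = np.sum((y_pred == 1) & (y_true == 1), axis=0)  # per-label true positives
    fp = np.sum((y_pred == 1) & (y_true == 0), axis=0)  # per-label false positives
    fn = np.sum((y_pred == 0) & (y_true == 1), axis=0)  # per-label false negatives
    f1 = 2 * tp / (2 * tp + fp + fn + eps)              # binary F1 per label
    return f1.mean()                                    # macro average over labels
```

Because the top-$k$ selection is shared across all labels of an instance, the per-label utilities are no longer independent, which is exactly the coupling the abstract refers to.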

Cite

Text

Schultheis et al. "Consistent Algorithms for Multi-Label Classification with Macro-at-$k$ Metrics." International Conference on Learning Representations, 2024.

Markdown

[Schultheis et al. "Consistent Algorithms for Multi-Label Classification with Macro-at-$k$ Metrics." International Conference on Learning Representations, 2024.](https://mlanthology.org/iclr/2024/schultheis2024iclr-consistent/)

BibTeX

@inproceedings{schultheis2024iclr-consistent,
  title     = {{Consistent Algorithms for Multi-Label Classification with Macro-at-$k$ Metrics}},
  author    = {Schultheis, Erik and Kotlowski, Wojciech and Wydmuch, Marek and Babbar, Rohit and Borman, Strom and Dembczynski, Krzysztof},
  booktitle = {International Conference on Learning Representations},
  year      = {2024},
  url       = {https://mlanthology.org/iclr/2024/schultheis2024iclr-consistent/}
}