Structured Prediction with Stronger Consistency Guarantees
Abstract
We present an extensive study of surrogate losses for structured prediction supported by *$H$-consistency bounds*. These recently introduced guarantees are more relevant to learning than Bayes-consistency, since they are non-asymptotic and take into account the hypothesis set $H$ used. We first show that no non-trivial $H$-consistency bound can be derived for widely used surrogate structured prediction losses. We then define several new families of surrogate losses, including *structured comp-sum losses* and *structured constrained losses*, for which we prove $H$-consistency bounds and thus Bayes-consistency. Minimizing these loss functions readily leads to new structured prediction algorithms with stronger theoretical guarantees. We describe efficient algorithms for minimizing several of these surrogate losses, including a new *structured logistic loss*.
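The abstract does not spell out the form of the *structured logistic loss*, so the following is only a minimal illustrative sketch, assuming it reduces to a log-sum-exp (softmax) loss over the structured output set, as in CRF-style log-losses; the function name, candidate set, and scores below are all hypothetical, and the brute-force enumeration stands in for the efficient minimization algorithms the paper describes.

```python
import math

def structured_logistic_loss(scores, true_y):
    """Sketch of a log-sum-exp loss over an enumerated structured output set.

    `scores` maps each candidate structure y' to a model score h(x, y');
    the value returned is  log sum_{y'} exp(h(x, y')) - h(x, true_y),
    i.e. the negative log-likelihood of the true structure under a softmax
    over all structures (a hypothetical form, for illustration only).
    """
    m = max(scores.values())  # shift by the max score to stabilize log-sum-exp
    log_z = m + math.log(sum(math.exp(s - m) for s in scores.values()))
    return log_z - scores[true_y]

# Toy usage with three hypothetical candidate structures.
scores = {"y1": 2.0, "y2": 0.5, "y3": -1.0}
print(structured_logistic_loss(scores, "y1"))  # ~0.24: the true structure scores highest
```

Enumerating all structures is exponential in general; in practice the explicit sum would be replaced by a dynamic-programming or sampling routine, which is the kind of efficient algorithm the abstract alludes to.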
Cite
Text
Mao et al. "Structured Prediction with Stronger Consistency Guarantees." Neural Information Processing Systems, 2023.
Markdown
[Mao et al. "Structured Prediction with Stronger Consistency Guarantees." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/mao2023neurips-structured/)
BibTeX
@inproceedings{mao2023neurips-structured,
  title = {{Structured Prediction with Stronger Consistency Guarantees}},
  author = {Mao, Anqi and Mohri, Mehryar and Zhong, Yutao},
  booktitle = {Neural Information Processing Systems},
  year = {2023},
  url = {https://mlanthology.org/neurips/2023/mao2023neurips-structured/}
}