Multi-Instance Partial-Label Learning with Margin Adjustment

Abstract

Multi-instance partial-label learning (MIPL) is an emerging learning framework where each training sample is represented as a multi-instance bag associated with a candidate label set. Existing MIPL algorithms often overlook the margins for attention scores and predicted probabilities, leading to suboptimal generalization performance. A critical issue with these algorithms is that the classifier's highest predicted probability may fall on a non-candidate label. In this paper, we propose an algorithm named MIPLMA, i.e., Multi-Instance Partial-Label learning with Margin Adjustment, which adjusts the margins for attention scores and predicted probabilities. We introduce a margin-aware attention mechanism to dynamically adjust the margins for attention scores and propose a margin distribution loss to constrain the margins between the predicted probabilities on candidate and non-candidate label sets. Experimental results demonstrate the superior performance of MIPLMA over existing MIPL algorithms, as well as other well-established multi-instance learning algorithms and partial-label learning algorithms.
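The quantity at the heart of the abstract is the margin between the probabilities a bag-level classifier assigns to candidate labels and those it assigns to non-candidate labels; a negative margin is exactly the failure case mentioned above, where the highest probability lands on a non-candidate label. Below is a minimal illustrative sketch of how such a margin could be computed and penalized, assuming per-bag predicted probabilities and a binary candidate mask. The function names (`candidate_margin`, `margin_penalty`) and the hinge-style penalty are assumptions for illustration only, not the paper's margin distribution loss or margin-aware attention mechanism.

```python
import torch


def candidate_margin(probs: torch.Tensor, candidate_mask: torch.Tensor) -> torch.Tensor:
    """Margin between the highest probability on the candidate label set
    and the highest probability on the non-candidate label set.

    probs:          (batch, num_classes) predicted probabilities per bag
    candidate_mask: (batch, num_classes) 1 for candidate labels, 0 otherwise
    """
    neg_inf = torch.finfo(probs.dtype).min
    # Highest probability among candidate labels.
    cand_max = probs.masked_fill(candidate_mask == 0, neg_inf).max(dim=1).values
    # Highest probability among non-candidate labels.
    non_cand_max = probs.masked_fill(candidate_mask == 1, neg_inf).max(dim=1).values
    # Positive when the classifier favors a candidate label, negative otherwise.
    return cand_max - non_cand_max


def margin_penalty(probs: torch.Tensor,
                   candidate_mask: torch.Tensor,
                   target_margin: float = 0.1) -> torch.Tensor:
    """Hinge-style penalty (illustrative only): zero once the margin
    exceeds `target_margin`, growing linearly as the margin shrinks."""
    margin = candidate_margin(probs, candidate_mask)
    return torch.clamp(target_margin - margin, min=0.0).mean()
```

As described in the abstract, MIPLMA constrains the distribution of these margins rather than applying a simple per-bag hinge, so the sketch above should be read only as a way to make the candidate/non-candidate margin concrete.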

Cite

Text

Tang et al. "Multi-Instance Partial-Label Learning with Margin Adjustment." Neural Information Processing Systems, 2024. doi:10.52202/079017-0829

Markdown

[Tang et al. "Multi-Instance Partial-Label Learning with Margin Adjustment." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/tang2024neurips-multiinstance/) doi:10.52202/079017-0829

BibTeX

@inproceedings{tang2024neurips-multiinstance,
  title     = {{Multi-Instance Partial-Label Learning with Margin Adjustment}},
  author    = {Tang, Wei and Yang, Yin-Fang and Wang, Zhaofei and Zhang, Weijia and Zhang, Min-Ling},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-0829},
  url       = {https://mlanthology.org/neurips/2024/tang2024neurips-multiinstance/}
}