Fast Multi-Instance Partial-Label Learning
Abstract
Multi-instance partial-label learning (MIPL) is a paradigm where each training example is encapsulated as a multi-instance bag associated with a candidate label set, which includes one true label and several false positive labels. Current MIPL algorithms typically assume that all instances are independent, thereby neglecting the dependencies and heterogeneity inherent in MIPL data. Moreover, these algorithms often prove excessively time-consuming on complex datasets, significantly limiting the practical application of MIPL. In this paper, we propose FastMIPL, a framework that employs a mixed-effects model to explicitly capture the dependencies and heterogeneity among instances and bags. FastMIPL learns from MIPL data both effectively and efficiently by utilizing a predefined dependency-modeling module and a posterior predictive probability disambiguation strategy. Experiments show that the performance of FastMIPL is highly competitive with state-of-the-art methods, while significantly reducing computational time on both benchmark and real-world datasets.
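To make the MIPL setting described in the abstract concrete, below is a minimal, hypothetical sketch (not the paper's code): each bag of instances carries a candidate label set, bag-level scores combine a shared (fixed-effect) linear term with a bag-specific (random-effect) offset, and the label is disambiguated by the posterior probability restricted to the candidate set. All names, the aggregation choice, and the toy data are illustrative assumptions.

```python
# Illustrative MIPL sketch: fixed + bag-specific effects, candidate-set disambiguation.
# This is an assumed toy formulation, not the FastMIPL implementation.
import numpy as np

rng = np.random.default_rng(0)
n_classes, n_features = 4, 8

# Toy MIPL data: each bag is (instance matrix, candidate label indices).
bags = [
    (rng.normal(size=(rng.integers(3, 7), n_features)),
     rng.choice(n_classes, size=2, replace=False))
    for _ in range(5)
]

W = rng.normal(scale=0.1, size=(n_features, n_classes))   # fixed effects shared by all bags
b = rng.normal(scale=0.1, size=(len(bags), n_classes))    # bag-specific random effects

def disambiguate(bag_idx, instances, candidates):
    """Pick the most probable candidate label from the posterior over the candidate set."""
    scores = instances @ W + b[bag_idx]          # instance-level scores plus the bag offset
    bag_score = scores.mean(axis=0)              # aggregate instances into a bag-level score
    probs = np.exp(bag_score - bag_score.max())
    probs /= probs.sum()                         # softmax posterior over all classes
    cand_probs = probs[candidates] / probs[candidates].sum()  # renormalize on the candidate set
    return candidates[np.argmax(cand_probs)], cand_probs

for i, (X, cands) in enumerate(bags):
    label, posterior = disambiguate(i, X, cands)
    print(f"bag {i}: candidates={cands.tolist()}, picked {label}, posterior={posterior.round(3)}")
```

In this sketch the random effect `b` stands in for the bag-level heterogeneity that the abstract says a mixed-effects model captures; a full treatment would fit both sets of effects from data rather than drawing them at random.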
Cite
Text
Yang et al. "Fast Multi-Instance Partial-Label Learning." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I21.34356
Markdown
[Yang et al. "Fast Multi-Instance Partial-Label Learning." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/yang2025aaai-fast/) doi:10.1609/AAAI.V39I21.34356
BibTeX
@inproceedings{yang2025aaai-fast,
title = {{Fast Multi-Instance Partial-Label Learning}},
author = {Yang, Yin-Fang and Tang, Wei and Zhang, Min-Ling},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2025},
pages = {22038--22046},
doi = {10.1609/AAAI.V39I21.34356},
url = {https://mlanthology.org/aaai/2025/yang2025aaai-fast/}
}