Label-Noise Robust Logistic Regression and Its Applications
Abstract
The classical problem of learning a classifier relies on a set of labelled examples, without ever questioning the correctness of the provided label assignments. However, there is an increasing realisation that labelling errors are not uncommon in real situations. In this paper we consider a label-noise robust version of the logistic regression and multinomial logistic regression classifiers and develop the following contributions: (i) We derive efficient multiplicative updates to estimate the label flipping probabilities, and we give a proof of convergence for our algorithm. (ii) We develop a novel sparsity-promoting regularisation approach which allows us to tackle challenging high dimensional noisy settings. (iii) Finally, we thoroughly evaluate the performance of our approach in synthetic experiments and we demonstrate several real applications including gene expression analysis, class topology discovery and learning from crowdsourcing data.
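The core idea of the abstract, treating the true label as a latent variable and estimating the label-flipping probabilities jointly with the classifier weights, can be illustrated with a schematic EM-style sketch. This is an illustrative reconstruction under standard assumptions, not the authors' multiplicative-update algorithm or proof-backed procedure; all function names and hyperparameters here are hypothetical.

```python
import numpy as np

def sigmoid(z):
    # Numerically stable logistic function
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def robust_lr(X, y_noisy, n_iter=300, lr=0.5, seed=0):
    """Illustrative EM-style label-noise robust logistic regression.

    Latent true label y in {0, 1}; the observed label may be flipped with
    probability gamma[j, k] = P(observed=k | true=j), estimated jointly
    with the weight vector w. A sketch, not the paper's exact updates.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.normal(scale=0.01, size=d)
    gamma = np.array([[0.8, 0.2], [0.2, 0.8]])  # assumed initial flip matrix
    for _ in range(n_iter):
        p1 = sigmoid(X @ w)                      # P(true=1 | x)
        prior = np.stack([1.0 - p1, p1], axis=1)  # (n, 2)
        lik = gamma[:, y_noisy].T                # (n, 2): gamma[j, observed_n]
        post = prior * lik
        post /= post.sum(axis=1, keepdims=True)  # E-step responsibilities
        # M-step for gamma: expected counts of (true=j, observed=k)
        for j in range(2):
            for k in range(2):
                gamma[j, k] = post[y_noisy == k, j].sum() / max(post[:, j].sum(), 1e-12)
        # Gradient ascent on w against the soft true labels post[:, 1]
        w += lr * (X.T @ (post[:, 1] - p1)) / n
    return w, gamma
```

On well-separated data with, say, 20% of labels flipped, such a scheme typically recovers a near-diagonal flip matrix and a decision boundary close to the clean-label one; the paper's multiplicative updates address the same estimation problem with a convergence guarantee.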
Cite
Text
Bootkrajang and Kabán. "Label-Noise Robust Logistic Regression and Its Applications." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2012. doi:10.1007/978-3-642-33460-3_15
Markdown
[Bootkrajang and Kabán. "Label-Noise Robust Logistic Regression and Its Applications." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2012.](https://mlanthology.org/ecmlpkdd/2012/bootkrajang2012ecmlpkdd-labelnoise/) doi:10.1007/978-3-642-33460-3_15
BibTeX
@inproceedings{bootkrajang2012ecmlpkdd-labelnoise,
title = {{Label-Noise Robust Logistic Regression and Its Applications}},
author = {Bootkrajang, Jakramate and Kabán, Ata},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2012},
pages = {143-158},
doi = {10.1007/978-3-642-33460-3_15},
url = {https://mlanthology.org/ecmlpkdd/2012/bootkrajang2012ecmlpkdd-labelnoise/}
}