Perceptron Based Learning with Example Dependent and Noisy Costs
Abstract
Learning algorithms from the fields of artificial neural networks and machine learning typically do not take any costs into account, or allow only costs that depend on the classes of the examples used for learning. As an extension of class-dependent costs, we consider costs that are example dependent, i.e. feature and class dependent. We derive a cost-sensitive perceptron learning rule for non-separable classes that can be extended to multi-modal classes (DIPOL). We also derive an approach for including example-dependent costs into an arbitrary cost-insensitive learning algorithm by sampling according to modified probability distributions.
ICML: Proceedings of the Twentieth International Conference on Machine Learning
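The two ideas in the abstract can be illustrated with a short sketch. This is not the authors' exact algorithm, only a minimal NumPy illustration under two assumptions: the perceptron variant simply scales each mistake-driven update by that example's cost, and the sampling approach keeps each example with probability proportional to its cost, so that a cost-insensitive learner trained on the resulting sample implicitly minimizes expected cost. All function and parameter names are invented for this sketch.

```python
import numpy as np

def cost_sensitive_perceptron(X, y, costs, epochs=20, lr=0.1):
    """Perceptron with per-example misclassification costs (illustrative).

    On a mistake, the update is scaled by that example's cost, so
    expensive examples pull the separating hyperplane more strongly.
    Labels y are in {-1, +1}.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi, ci in zip(X, y, costs):
            if yi * (xi @ w + b) <= 0:  # misclassified (or on the boundary)
                w += lr * ci * yi * xi
                b += lr * ci * yi
    return w, b

def cost_proportionate_sample(rng, X, y, costs):
    """Keep example i with probability costs[i] / max(costs).

    The retained sample follows a cost-modified distribution, so any
    cost-insensitive learner can then be trained on (X_kept, y_kept).
    """
    keep = rng.random(len(costs)) < costs / costs.max()
    return X[keep], y[keep]
```

A maximum-cost example is always retained by the sampler, while a zero-cost example is never retained; intermediate costs are kept proportionally often.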
Cite
Text
Geibel and Wysotzki. "Perceptron Based Learning with Example Dependent and Noisy Costs." International Conference on Machine Learning, 2003.
Markdown
[Geibel and Wysotzki. "Perceptron Based Learning with Example Dependent and Noisy Costs." International Conference on Machine Learning, 2003.](https://mlanthology.org/icml/2003/geibel2003icml-perceptron/)
BibTeX
@inproceedings{geibel2003icml-perceptron,
title = {{Perceptron Based Learning with Example Dependent and Noisy Costs}},
author = {Geibel, Peter and Wysotzki, Fritz},
booktitle = {International Conference on Machine Learning},
year = {2003},
pages = {218--225},
url = {https://mlanthology.org/icml/2003/geibel2003icml-perceptron/}
}