Dealing with Multiple Classes in Online Class Imbalance Learning

Abstract

Online class imbalance learning deals with data streams having very skewed class distributions in a timely fashion. Although a few methods have been proposed to handle such problems, most of them focus on two-class cases. Multi-class imbalance imposes additional challenges in learning. This paper studies the combined challenges posed by multi-class imbalance and online learning, and aims at a more effective and adaptive solution. First, we introduce two resampling-based ensemble methods, called MOOB and MUOB, which can process multi-class data directly and strictly online with an adaptive sampling rate. Then, we look into the impact of multi-minority and multi-majority cases on MOOB and MUOB in comparison to other methods under stationary and dynamic scenarios. Both multi-minority and multi-majority make a negative impact. MOOB shows the best and most stable G-mean in most stationary and dynamic cases.
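The core idea behind an oversampling-based online bagging method like MOOB can be illustrated in a few lines: keep a time-decayed estimate of each class's proportion in the stream, and train each ensemble member on an incoming example K ~ Poisson(λ) times, where λ grows as the example's class becomes rarer. The sketch below is illustrative only, not the paper's implementation: the class names (`MOOBSketch`, `Centroid`), the decay constant, and the toy nearest-centroid base learner (standing in for a proper incremental learner such as a Hoeffding tree) are all assumptions.

```python
import math
import random
from collections import defaultdict


class Centroid:
    """Toy incremental base learner: nearest class centroid on 2-D inputs.

    A stand-in for a real online learner; NOT the base learner used in the paper."""
    def __init__(self):
        self.sums = defaultdict(lambda: [0.0, 0.0])
        self.counts = defaultdict(int)

    def learn(self, x, y):
        s = self.sums[y]
        s[0] += x[0]
        s[1] += x[1]
        self.counts[y] += 1

    def predict(self, x):
        if not self.counts:
            return None
        def sq_dist(c):
            s, n = self.sums[c], self.counts[c]
            return (x[0] - s[0] / n) ** 2 + (x[1] - s[1] / n) ** 2
        return min(self.counts, key=sq_dist)


class MOOBSketch:
    """Sketch of multi-class oversampling-based online bagging.

    w[c] is a time-decayed estimate of class c's share of the stream.
    Each learner sees the example K ~ Poisson(lambda) times, with
    lambda = max(w) / w[y], so minority-class examples are oversampled.
    (An undersampling variant in the spirit of MUOB would instead shrink
    lambda for majority-class examples.)"""
    def __init__(self, n_learners=10, decay=0.99, rng=None):
        self.learners = [Centroid() for _ in range(n_learners)]
        self.w = defaultdict(float)
        self.decay = decay
        self.rng = rng or random.Random(0)

    def learn(self, x, y):
        # Update time-decayed class proportions.
        for c in list(self.w):
            self.w[c] *= self.decay
        self.w[y] += 1.0 - self.decay
        lam = max(self.w.values()) / self.w[y]  # >= 1; larger for rarer classes
        for learner in self.learners:
            for _ in range(self._poisson(lam)):
                learner.learn(x, y)

    def _poisson(self, lam):
        # Knuth's method; adequate for the small lambdas arising here.
        threshold = math.exp(-lam)
        k, p = 0, 1.0
        while True:
            p *= self.rng.random()
            if p <= threshold:
                return k
            k += 1

    def predict(self, x):
        votes = defaultdict(int)
        for learner in self.learners:
            c = learner.predict(x)
            if c is not None:
                votes[c] += 1
        return max(votes, key=votes.get) if votes else None
```

Because the sampling rate is derived from the running class-proportion estimates rather than a fixed imbalance ratio, the ensemble adapts as the stream's class distribution drifts, which is what "strictly online with an adaptive sampling rate" refers to in the abstract.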

Cite

Text

Wang et al. "Dealing with Multiple Classes in Online Class Imbalance Learning." International Joint Conference on Artificial Intelligence, 2016.

Markdown

[Wang et al. "Dealing with Multiple Classes in Online Class Imbalance Learning." International Joint Conference on Artificial Intelligence, 2016.](https://mlanthology.org/ijcai/2016/wang2016ijcai-dealing/)

BibTeX

@inproceedings{wang2016ijcai-dealing,
  title     = {{Dealing with Multiple Classes in Online Class Imbalance Learning}},
  author    = {Wang, Shuo and Minku, Leandro L. and Yao, Xin},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2016},
  pages     = {2118--2124},
  url       = {https://mlanthology.org/ijcai/2016/wang2016ijcai-dealing/}
}