Manifold Distance-Based Over-Sampling Technique for Class Imbalance Learning

Abstract

Over-sampling techniques for handling the class imbalance problem generate additional minority samples to balance the dataset sizes of the different classes. However, sampling in the original data space is ineffective when the data of different classes overlap or are disjoint. To address this, a new minority-sample generation method based on the manifold distance, rather than the Euclidean distance, is presented. Overlapping majority and minority samples tend to distribute in fully disjoint subspaces from the view of manifold learning. Moreover, the method avoids generating samples between minority data points that lie far apart in the manifold space. Experiments on 23 UCI datasets show that the proposed method achieves better classification accuracy.
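
Only the abstract is available here, but the core idea can be illustrated with a short sketch. The snippet below is not the authors' algorithm; it is a minimal SMOTE-style over-sampler in which neighbours are chosen by a graph-based (geodesic) approximation of the manifold distance instead of the Euclidean distance. The function name manifold_oversample, the k-NN graph construction, and the parameters k and n_new are illustrative assumptions.

import numpy as np
from scipy.sparse.csgraph import shortest_path
from sklearn.neighbors import kneighbors_graph


def manifold_oversample(X_min, k=5, n_new=100, seed=None):
    """SMOTE-style interpolation between manifold-nearest minority samples."""
    rng = np.random.default_rng(seed)
    n = X_min.shape[0]

    # Approximate the manifold (geodesic) distance with shortest paths
    # on a k-nearest-neighbour graph over the minority samples.
    knn = kneighbors_graph(X_min, n_neighbors=k, mode="distance")
    geo = shortest_path(knn, method="D", directed=False)

    synthetic = []
    for _ in range(n_new):
        i = rng.integers(n)
        d = geo[i].copy()
        d[i] = np.inf                              # exclude the point itself
        order = np.argsort(d)
        candidates = order[np.isfinite(d[order])][:k]
        if candidates.size == 0:                   # isolated point: skip it
            continue
        j = rng.choice(candidates)
        # Interpolate only between points that are close on the manifold,
        # so no sample is generated between minority points that lie far
        # apart in the manifold space.
        lam = rng.random()
        synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.asarray(synthetic)


# Example: generate 70 synthetic samples for a toy 2-D minority set.
X_min = np.random.default_rng(0).normal(size=(30, 2))
X_new = manifold_oversample(X_min, k=5, n_new=70, seed=1)

Approximating the geodesic distance by shortest paths on a k-NN graph follows the Isomap construction; other manifold-distance estimates would fit the same template.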

Cite

Text

Yang et al. "Manifold Distance-Based Over-Sampling Technique for Class Imbalance Learning." AAAI Conference on Artificial Intelligence, 2019. doi:10.1609/AAAI.V33I01.330110071

Markdown

[Yang et al. "Manifold Distance-Based Over-Sampling Technique for Class Imbalance Learning." AAAI Conference on Artificial Intelligence, 2019.](https://mlanthology.org/aaai/2019/yang2019aaai-manifold/) doi:10.1609/AAAI.V33I01.330110071

BibTeX

@inproceedings{yang2019aaai-manifold,
  title     = {{Manifold Distance-Based Over-Sampling Technique for Class Imbalance Learning}},
  author    = {Yang, Lingkai and Guo, Yi-Nan and Cheng, Jian},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2019},
  pages     = {10071-10072},
  doi       = {10.1609/AAAI.V33I01.330110071},
  url       = {https://mlanthology.org/aaai/2019/yang2019aaai-manifold/}
}