2005
DOI: 10.1007/11538059_91
Borderline-SMOTE: A New Over-Sampling Method in Imbalanced Data Sets Learning

Cited by 2,818 publications (1,635 citation statements)
References 15 publications
“…(2005) designed an improvement of SMOTE, namely Borderline-SMOTE [10]. The authors divided positive instances into three regions: noise, borderline, and safe, by considering the number of negative instances among the k nearest neighbours.…”
Section: Related Work
confidence: 99%
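The region split described in this statement can be sketched in a few lines of NumPy. This is a hedged illustration, not code from the paper: the function name, the Euclidean metric, and the use of `k/2` as the borderline threshold are assumptions based on the common description of Borderline-SMOTE (noise: all k neighbours are majority; borderline: at least half are; safe: fewer than half).

```python
import numpy as np

def classify_minority_regions(X, y, minority_label=1, k=5):
    """Label each minority sample as 'noise', 'borderline', or 'safe'
    based on how many of its k nearest neighbours belong to the
    majority class (a sketch of the Borderline-SMOTE region split)."""
    minority_idx = np.where(y == minority_label)[0]
    regions = {}
    for i in minority_idx:
        # Euclidean distances from sample i to every other sample
        dists = np.linalg.norm(X - X[i], axis=1)
        dists[i] = np.inf  # exclude the sample itself
        neighbours = np.argsort(dists)[:k]
        n_majority = np.sum(y[neighbours] != minority_label)
        if n_majority == k:
            regions[i] = "noise"       # surrounded entirely by majority
        elif n_majority >= k / 2:
            regions[i] = "borderline"  # on the class boundary
        else:
            regions[i] = "safe"        # mostly minority neighbourhood
    return regions
```

In the full method, only the "borderline" samples are then over-sampled, which concentrates the synthetic data where the classifier is most likely to err.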
“…(2005) designed the improvement of SMOTE, namely Borderline-SMOTE [10]. The authors divided positive instances into three regions; noise, borderline, and safe, by considering the number of negative instances on k nearest neighbours.…”
Section: Related Workmentioning
confidence: 99%
“…At the data level, various re-sampling techniques are applied to balance the class distribution, including over-sampling minority-class instances and under-sampling majority-class instances [5], [6], [7], [8]. In particular, SMOTE (Synthetic Minority Over-sampling Technique) [1] is a popular approach for generating new minority-class data, which can expand the decision boundary towards the majority class.…”
Section: Introduction
confidence: 99%
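The synthetic-generation idea mentioned in this statement, interpolating between a minority sample and one of its minority nearest neighbours, can be sketched as follows. This is a minimal illustration of SMOTE's core mechanism, not a reference implementation; the helper name and parameters are hypothetical.

```python
import numpy as np

def smote_sample(X_min, k=5, n_new=10, rng=None):
    """Generate synthetic minority samples by interpolating between a
    randomly chosen minority sample and one of its k nearest minority
    neighbours (the core interpolation step of SMOTE)."""
    rng = np.random.default_rng(rng)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        # k nearest minority neighbours of sample i (excluding itself)
        dists = np.linalg.norm(X_min - X_min[i], axis=1)
        dists[i] = np.inf
        neighbours = np.argsort(dists)[:k]
        j = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(synthetic)
```

Because every synthetic point lies on a segment between two existing minority samples, the generated data stays inside the minority region while pushing the learned decision boundary towards the majority class.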
“…Of these three general topics in class imbalance, data-level methods are the most investigated. These methods balance the original data set, either by over-sampling the minority class [8][9][10][11] and/or by under-sampling the majority class [12][13][14], until the classes are approximately equally represented.…”
Section: Introduction
confidence: 99%