2008 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
DOI: 10.1109/ijcnn.2008.4633969

ADASYN: Adaptive synthetic sampling approach for imbalanced learning

Abstract: This paper presents a novel adaptive synthetic (ADASYN) sampling approach for learning from imbalanced data sets. The essential idea of ADASYN is to use a weighted distribution for different minority class examples according to their level of difficulty in learning, where more synthetic data is generated for minority class examples that are harder to learn than for those that are easier to learn. As a result, the ADASYN approach improves learning with respect to the data distribut…
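The weighted-generation idea in the abstract can be sketched in plain Python. This is a minimal, self-contained illustration under my own naming, not the paper's reference implementation: each minority sample's "difficulty" is estimated as the fraction of majority points among its k nearest neighbours, and that ratio decides how many synthetic points are interpolated around it.

```python
import math
import random

def adasyn_sketch(minority, majority, k=5, beta=1.0, rng=None):
    """Generate synthetic minority samples, weighted toward 'hard' examples.

    minority / majority: lists of equal-length feature tuples.
    beta: desired balance level (1.0 -> fully balanced classes).
    """
    rng = rng or random.Random(0)
    data = [(x, 1) for x in minority] + [(x, 0) for x in majority]

    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    # Total number of synthetic samples needed for the desired balance level.
    G = int(beta * (len(majority) - len(minority)))

    # r_i: fraction of majority points among each minority sample's k nearest
    # neighbours -- ADASYN's proxy for how hard that sample is to learn.
    ratios = []
    for x in minority:
        neighbours = sorted(data, key=lambda p: dist(p[0], x))[1:k + 1]
        ratios.append(sum(1 for _, label in neighbours if label == 0) / k)

    total = sum(ratios) or 1.0
    synthetic = []
    for x, r in zip(minority, ratios):
        g_i = round(r / total * G)  # more synthesis where learning is harder
        nbrs = sorted(minority, key=lambda p: dist(p, x))[1:k + 1]
        for _ in range(g_i):
            nbr = rng.choice(nbrs)
            lam = rng.random()
            # Interpolate between x and a random minority-class neighbour.
            synthetic.append(tuple(xi + lam * (ni - xi)
                                   for xi, ni in zip(x, nbr)))
    return synthetic
```

A production implementation would use a proper k-NN index rather than sorting the whole data set per query; the sketch only shows where ADASYN departs from SMOTE, namely the per-sample weights `r_i`.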


Cited by 1,570 publications (742 citation statements). References 15 publications.
“…Adaptive Synthetic Over-Sampling (ADASYN) is considered an extension of SMOTE, which is characterized by the creation of more samples in the vicinity of the boundary between the two classes than inside the minority class [62].…”
Section: Over-Sampling Methods
confidence: 99%
“…Thus, more "intelligent" or heuristic sampling methods have been developed; for example, SMOTE [61] and ADASYN [62]. Moreover, they have been combined with editing methods, such as Tomek's Links (TL) [26], Edited Nearest Neighbor (ENN) [27], the Condensed Nearest Neighbor rule (CNN) [28] and others [22,29-33,63]; i.e., hybrid methods, such as SMOTE+TL, SMOTE+CNN, SMOTE+OSS and SMOTE+ENN [22,64,65].…”
Section: Sampling Class Imbalance Approaches
confidence: 99%
“…SMOTE encounters some drawbacks including over-generalization and lack of systematizing disjuncts. Enhanced techniques such as Borderline-SMOTE [20], SafeLevel-SMOTE [21] and adaptive synthetic sampling (ADASYN) [22] help to overcome these drawbacks. The proposed technique follows the same baseline while leveraging the disjuncts and generalization issue.…”
Section: Related Work
confidence: 99%
“…Due to the imbalance of data in the training set, Adaptive Synthesis (ADASYN) [7] is applied to generate synthetic data for training. The parameters of ADASYN are beta=1 and k=5, which means that after ADASYN we will have the same amount of data labelled as 1 and -1.…”
Section: Dealing With Imbalanced Data
confidence: 99%
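The quoted claim about beta=1 follows from ADASYN's balance formula: the number of synthetic minority samples is G = beta * (m_l - m_s), where m_l and m_s are the majority and minority counts. A hypothetical count check (the variable names are mine, not from the cited work):

```python
# With beta = 1, ADASYN generates exactly enough synthetic minority samples
# to equalise the two class counts, as the quoted passage states.
m_l, m_s = 900, 100          # illustrative majority / minority counts
beta = 1.0                   # desired degree of balance after synthesis
G = int(beta * (m_l - m_s))  # number of synthetic minority samples
assert m_s + G == m_l        # classes labelled 1 and -1 end up equal in size
print(G)  # 800
```

Choosing beta < 1 would leave a residual imbalance, which is sometimes preferable when full balancing over-inflates the minority region.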