2018
DOI: 10.1109/tse.2017.2731766

MAHAKIL: Diversity Based Oversampling Approach to Alleviate the Class Imbalance Issue in Software Defect Prediction

Cited by 231 publications (94 citation statements) | References 63 publications

“…It then picks one of those rows y_i at random and creates a new example at a randomly selected distance between x and y_i. Some recent results report that off-the-shelf SMOTE can be improved by some local tuning [3], [12]. We do not use such local tuning since it has recently been shown that such tunings are outperformed by FFTs (see below).…”
Section: SMOTE
confidence: 99%
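
The excerpt above fully specifies SMOTE's interpolation step, so a minimal sketch may help. This is our illustration in NumPy, not code from the cited work; the function name smote_sample and the parameters k and n_new are assumptions:

```python
import numpy as np

def smote_sample(X_min, k=5, n_new=100, rng=None):
    """Minimal SMOTE-style oversampling sketch (illustrative only).

    For each synthetic example: pick a minority row x, pick one of its
    k nearest minority neighbours y_i at random, and create a new point
    at a randomly chosen distance along the segment between x and y_i.
    Assumes k < len(X_min).
    """
    rng = np.random.default_rng(rng)
    n = len(X_min)
    # pairwise Euclidean distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=2)
    # indices of the k nearest neighbours of each row (column 0 is the row itself)
    nn = np.argsort(d, axis=1)[:, 1:k + 1]
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(n)           # a minority example x
        j = rng.choice(nn[i])         # one of its k neighbours y_i
        gap = rng.random()            # random position on the segment x -> y_i
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.asarray(synthetic)
```
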
“…Quite often researchers propose one specific use of a machine learning model in software defect prediction, such as Naïve Bayes [10], and use the work of Lessmann et al. [4] to justify their choice of classifier(s) [53], e.g. "[38] showed that RF was significantly better than 21 other prediction models" [29], although Lessmann et al. do not make any such assertion [4]. This study shows that, although it is still unclear which classifier performs the best, researchers should justify the use and validity of their choice of classifier [48].…”
Section: Comparison With Other Studies
confidence: 62%
“…Of the advanced over-sampling methods, some typical or recently proposed examples are the SMOTE (Chawla et al. 2002), MWMOTE (majority weighted minority over-sampling) (Barua et al. 2014), graph-based over-sampling (Pérez-Ortiz et al. 2015), diversity-based over-sampling (Bennin et al. 2018), and generative adversarial network-based over-sampling (Douzas and Bacao 2018) techniques. Among them, SMOTE is the most established and widely used over-sampling method, which creates synthetic minority samples by interpolating existing minority samples along the line segments joining the k minority class nearest neighbours.…”
Section: Resampling Techniques For Class Imbalance
confidence: 99%
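
The paper this page indexes, MAHAKIL, is the "diversity-based over-sampling" named in the excerpt above, so a heavily hedged sketch of its core idea may be useful: rank minority instances by Mahalanobis distance from the class mean, split the ranking into two "parent" bins, and average paired parents to breed synthetic children. Everything below (the function name mahakil_sample, the breeding schedule, the parameters) is our reading of Bennin et al.'s description, not their reference implementation:

```python
import numpy as np

def mahakil_sample(X_min, n_new):
    """Illustrative MAHAKIL-style generation loop (not the published code).

    1. Rank minority instances by Mahalanobis distance from the class mean.
    2. Split the ranked list into two equal parent bins.
    3. Average paired parents to breed a generation of synthetic children;
       children then pair with a parent bin to breed the next generation,
       until n_new samples exist (this schedule is our simplification).
    """
    mu = X_min.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(X_min, rowvar=False))
    diff = X_min - mu
    # squared Mahalanobis distance of each instance from the class mean
    dist = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    order = np.argsort(dist)[::-1]          # most distant (most diverse) first
    half = len(order) // 2
    if half == 0:
        raise ValueError("need at least two minority instances")
    bin1, bin2 = X_min[order[:half]], X_min[order[half:2 * half]]

    synthetic = []
    mother, father = bin1, bin2
    while len(synthetic) < n_new:
        children = (mother + father) / 2.0  # feature-wise average of parent pairs
        synthetic.extend(children)
        mother, father = children, mother   # children breed with a parent bin
    return np.asarray(synthetic[:n_new])
```

Because the children inherit averaged feature values from parents drawn from opposite ends of the Mahalanobis ranking, the synthetic samples stay spread across the minority region rather than clustering near a few seeds, which is the "diversity" the method's name refers to.
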