2015 IEEE International WIE Conference on Electrical and Computer Engineering (WIECON-ECE)
DOI: 10.1109/wiecon-ece.2015.7444011

Selecting best attributes for software defect prediction

Cited by 9 publications (8 citation statements)
References 16 publications
“…Mandal and Ami were the first to propose attribute selection for the SDP model [12]. The experimental findings indicate that the proposed approach yields a set of attributes nearly as good as the ideal set, which improves the performance of the defect prediction model.…”
Section: Feature Selection Techniques For Software Fault Prediction
confidence: 99%
“…In [4], the researchers introduced a comprehensive attribute selection process consisting of five consecutive steps. First, they calculated the balance of each attribute using a base classifier and then ranked the attributes in a list based on their respective balance values.…”
Section: A Rq1: Which Feature Selection Methods Are Implemented For S...
confidence: 99%
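The per-attribute "balance" ranking described in this statement can be sketched briefly. The following is a minimal sketch, assuming (these details are not stated in the excerpt) a Naive Bayes base classifier and the common defect-prediction definition balance = 1 - sqrt((pf^2 + (1 - pd)^2) / 2), where pd is the probability of detection and pf the probability of false alarm:

```python
# Hedged sketch: rank attributes by the "balance" a base classifier achieves
# when trained on each attribute alone. The base classifier (GaussianNB) and
# the balance formula are assumptions, not details taken from the cited paper.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

def balance_score(y_true, y_pred):
    """Balance = 1 - distance from the ideal ROC point (pf=0, pd=1), normalized."""
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    pd_ = tp / (tp + fn) if (tp + fn) else 0.0   # probability of detection (recall)
    pf = fp / (fp + tn) if (fp + tn) else 0.0    # probability of false alarm
    return 1.0 - np.sqrt((pf ** 2 + (1.0 - pd_) ** 2) / 2.0)

def rank_attributes_by_balance(X, y, cv=5):
    """Return attribute indices sorted by single-attribute balance, best first."""
    scores = []
    for j in range(X.shape[1]):
        preds = cross_val_predict(GaussianNB(), X[:, [j]], y, cv=cv)
        scores.append(balance_score(y, preds))
    order = sorted(range(X.shape[1]), key=lambda j: scores[j], reverse=True)
    return order, scores
```

Ranking by single-attribute balance is only the first part of the quoted five-step process; the later steps, which select a subset from the ranked list, would build on this ordering.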
“…Researchers proposed several frameworks, in-
Proposed an FS algorithm based on feature combinations, sorting, and selecting the best feature set
[2] Proposed an FS framework based on feature clustering and feature ranking called FECAR
[38] Proposed an FS method named relief-LC algorithm by seamlessly combining relief FS and Linear Correlation analysis
2015
[24] Proposed an average probability ensemble (APE) method using greedy forward selection (GFS) and correlation-based FS methods
[3] Proposed a clustering-based FS method using maximal information coefficient
[7] Explored ten filter/wrapper-based FS methods on sixteen open-source projects.
[4] Proposed a comprehensive FS method calculating the balance, frequency and weight of each attribute
2016
[10] Propose a sequential FS method by calculating and iteratively optimizing the balance of each attribute
[5] Proposed an optimized FS method using Correlation and ANOVA
[4] Proposed MICHAC (Maximal Information Coefficient with Hierarchical Agglomerative Clustering) framework, using MIC and Hierarchical Agglomerative Clustering (HAC) methods for FS
[53] Proposed a framework using chisquaredattributeeval and correlation attribute eval along with best first, greedy stepwise, and Hierarchical Agglomerative Clustering (HAC)
[3] Proposed FS method through General Linear Model (GLM) regression to assess variable importance (VI) for each feature
2017
[41] Proposed an FS method using density-based clustering of hybrid data and ranking strategies
[42] Proposed a method for FS using bat-based search along with Correlation-based Feature Selection (CFS)
[40] Integrated the SMOTE sampling technique with the three feature selection algorithms, including Chi-Square (CS), Information Gain (IG), and Relief (RLF)
[29] Proposed a multi-objective feature selection method based on optimization principles
[28] Proposed a FS method based on a similarity measure (SM)
[27] Designed a parallel hybrid framework for FS that combined both filter and wrapper methods
[8] Proposed a hybrid method for FS using filter and wrapper methods
[37] Proposed FESCH (Feature Selection Using Clusters of Hybrid-Data) for Cross-Project Defect Prediction (CPDP)
[60] Proposed correlation-based FS method
2018
[44] Proposed a hybrid FS method using wrapper and filter methods by implementing a fast correlation-based filter method
2019
[45] Proposed an FS method using three wrapper algorithms, including Binary Genetic Algorithm (BGA), Binary Particle Swarm Optimiza...…”
Section: ) Embedded Method
confidence: 99%
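Many of the filter-style entries listed above share the same basic pattern: score each software metric independently with a statistical criterion and keep the top-k. As an illustration only (the dataset, the value of k, and the use of scikit-learn are assumptions, not details from any of the cited works), a minimal sketch using the Chi-Square and Information Gain criteria mentioned in the list:

```python
# Minimal filter-based feature-selection sketch: score each metric
# independently and keep the top-k. Chi-Square and Information Gain are two
# of the filter criteria named above; data and k are illustrative placeholders.
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2, mutual_info_classif

def top_k_features(X, y, k=5, criterion="chi2"):
    """Return indices of the k highest-scoring features under the chosen filter."""
    scorer = chi2 if criterion == "chi2" else mutual_info_classif
    selector = SelectKBest(score_func=scorer, k=k).fit(X, y)
    return np.flatnonzero(selector.get_support())

# Example with random non-negative data (chi2 requires non-negative inputs).
rng = np.random.default_rng(0)
X = rng.random((200, 20))          # 200 modules, 20 static code metrics
y = rng.integers(0, 2, size=200)   # defect-prone (1) vs clean (0)
print(top_k_features(X, y, k=5, criterion="chi2"))
```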
“…On the other hand, wrapper methods locate the most predictive feature subset with the use of search algorithms. It is expected that relevant feature subsets may produce a better prediction ability compared to the features alone [5]. In this study, we evaluate filtering based feature selection algorithms to obtain an effective feature subset.…”
Section: Introduction
confidence: 99%
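To make the filter/wrapper contrast in this statement concrete, the sketch below shows a wrapper-style greedy forward search that adds, at each step, the feature whose inclusion most improves cross-validated performance. The classifier, fold count, and stopping rule are illustrative assumptions, not the procedure evaluated in the cited study:

```python
# Hedged sketch of wrapper-style greedy forward selection: repeatedly add the
# feature whose inclusion most improves cross-validated accuracy of a chosen
# classifier. Classifier, CV folds, and stopping rule are illustrative choices.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def greedy_forward_selection(X, y, max_features=5, cv=5):
    selected, remaining = [], list(range(X.shape[1]))
    best_score = -np.inf
    while remaining and len(selected) < max_features:
        # Evaluate each candidate feature added to the current subset.
        scores = {
            j: cross_val_score(LogisticRegression(max_iter=1000),
                               X[:, selected + [j]], y, cv=cv).mean()
            for j in remaining
        }
        j_best, s_best = max(scores.items(), key=lambda kv: kv[1])
        if s_best <= best_score:   # stop when no candidate improves the score
            break
        selected.append(j_best)
        remaining.remove(j_best)
        best_score = s_best
    return selected, best_score
```

A filter method, by contrast, scores features without retraining the classifier, which is why wrappers are usually more expensive but can capture interactions between features.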