2015
DOI: 10.5120/20462-2829

Literature Review of Feature Selection for Mining Tasks

Abstract: During the past few decades, researchers have worked on data preprocessing techniques for datasets. Data preprocessing is needed to prepare the data for mining. The performance of data mining algorithms in most cases depends on dataset quality, since low-quality training data may lead to the construction of overfitting or fragile classifiers. Scientists have also worked on data mining in both the algorithmic and the applied, conceptual areas. But for better results they have always used the comb…

Cited by 10 publications (2 citation statements)
References 15 publications
“…IG is used to indicate feature relevance in this setting [15]. The formula for Information Gain (IG) is defined as equation (2) [16]:…”
Section: Information Gain Ratio
confidence: 99%
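The quoted statement elides equation (2), so as a hedged illustration only, the standard information gain computation for a discrete feature can be sketched as follows (function names are ours, not taken from the cited papers):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG(Y; X) = H(Y) - H(Y | X) for one discrete feature column."""
    n = len(labels)
    # Group the labels by feature value, then take the weighted
    # average of the per-group entropies as the conditional entropy.
    groups = {}
    for x, y in zip(feature_values, labels):
        groups.setdefault(x, []).append(y)
    conditional = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - conditional
```

A feature that perfectly separates the classes yields IG equal to the label entropy, while a feature independent of the labels yields IG near zero.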
“…In this approach a ranking method is used: all attributes are ranked according to their relevance to the classification task, and attributes are selected based on their rank. The approach is classifier-independent, since feature selection is performed only once and the selected features can then be tested with different classifiers (Mwadulo, 2016; Pervez & Farid, 2015; Lu et al., 2012).…”
Section: Feature Selection
confidence: 99%
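The classifier-independent ranking described above is a filter method: score each attribute once, rank, and keep the top-k indices for use with any downstream classifier. A minimal sketch, assuming information gain as the relevance score (the function names and the choice of score are illustrative, not prescribed by the cited works):

```python
import math
from collections import Counter

def _entropy(labels):
    """Shannon entropy of a label list."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def _ig(column, labels):
    """Information gain of one discrete feature column (relevance score)."""
    n = len(labels)
    groups = {}
    for x, y in zip(column, labels):
        groups.setdefault(x, []).append(y)
    return _entropy(labels) - sum(len(g) / n * _entropy(g) for g in groups.values())

def rank_and_select(columns, labels, k):
    """Filter feature selection: rank attributes by relevance score and
    return the indices of the top-k. Runs once, independent of any classifier."""
    scored = sorted(((_ig(col, labels), i) for i, col in enumerate(columns)),
                    reverse=True)
    return [i for _, i in scored[:k]]
```

The returned index list can then be used to project the dataset before training each candidate classifier, which is what makes the one-time selection classifier-independent.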