2017 IEEE International Conference on Data Mining Workshops (ICDMW)
DOI: 10.1109/icdmw.2017.153

An Empirical Evaluation of Techniques for Feature Selection with Cost

Cited by 7 publications (5 citation statements, 2019–2023) · References 17 publications
“…It uses a separating hyperplane to distinguish between the two classes. The optimization problem is constrained to minimize the number of dimensions used by the hyperplane, and as a result, feature selection takes place [16].…”
Section: Feature Selection via Concave Minimization (FSV)
Confidence: 99%
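The concave-minimization idea described in this statement can be sketched briefly. The following is a minimal illustration, not the cited paper's exact algorithm: it approximates the zero-norm of the hyperplane weight vector with the concave term sum(1 - exp(-alpha*|w_i|)) and trades it off against a hinge separation error. The function name fsv_select, the Powell solver, and all parameter defaults are illustrative assumptions.

```python
# A minimal FSV-style sketch, assuming binary labels y in {-1, +1} and a
# sample matrix X of shape (m, n). The zero-norm of the hyperplane weights
# is approximated by the concave term sum(1 - exp(-alpha*|w_i|)) and traded
# off against a hinge separation error. Solver choice, defaults, and the
# function name are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

def fsv_select(X, y, alpha=5.0, lam=0.1, tol=1e-3):
    m, n = X.shape

    def objective(params):
        w, gamma = params[:n], params[n]
        margins = y * (X @ w - gamma)
        hinge = np.maximum(0.0, 1.0 - margins).mean()        # separation error
        sparsity = np.sum(1.0 - np.exp(-alpha * np.abs(w)))  # concave ||w||_0 proxy
        return hinge + lam * sparsity

    res = minimize(objective, np.zeros(n + 1), method="Powell")
    w = res.x[:n]
    return np.flatnonzero(np.abs(w) > tol)  # indices of retained features
```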
“…The Laplacian method, which is an unsupervised method, evaluates the importance of features on the premise that samples belonging to the same group lie closer to each other than to samples in other groups. The method therefore builds a nearest-neighbour graph and ranks the features by importance according to the graph Laplacian matrix [16].…”
Section: Laplacian Score (LS)
Confidence: 99%
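The graph construction and ranking just described can be written out directly. The sketch below follows the standard Laplacian Score formulation (k-nearest-neighbour graph, heat-kernel weights, score f̃ᵀLf̃ / f̃ᵀDf̃); the function name and the defaults for k and t are assumptions, not values taken from any cited work.

```python
# A minimal sketch of the Laplacian Score, assuming rows of X are samples.
# It builds a k-nearest-neighbour graph with heat-kernel weights and scores
# each feature r as f~^T L f~ / f~^T D f~; smaller scores indicate features
# that better preserve local structure. k, t, and the function name are
# illustrative assumptions.
import numpy as np

def laplacian_score(X, k=5, t=1.0):
    m, n = X.shape
    sq = (X ** 2).sum(axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    knn = np.argsort(d2, axis=1)[:, 1:k + 1]        # skip self at column 0
    S = np.zeros((m, m))
    rows = np.repeat(np.arange(m), k)
    cols = knn.ravel()
    S[rows, cols] = np.exp(-d2[rows, cols] / t)     # heat-kernel weights
    S = np.maximum(S, S.T)                          # symmetrize the graph
    D = np.diag(S.sum(axis=1))
    L = D - S                                       # graph Laplacian
    one = np.ones(m)
    scores = np.empty(n)
    for r in range(n):
        f = X[:, r]
        f_t = f - (f @ D @ one) / (one @ D @ one) * one  # remove weighted mean
        denom = f_t @ D @ f_t
        scores[r] = (f_t @ L @ f_t) / denom if denom > 0 else np.inf
    return scores  # rank ascending: lower score = more important feature
```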
“…CM applications could be characterized by several data sources, leading to large datasets. Although having more data can generate better results, a greater amount of data also increases the impact of the curse of dimensionality [36]. Consequently, selecting a subset of relevant features or PVs is crucial to improve the subsequent calculation steps.…”
Section: Introduction
Confidence: 99%
“…Generally, feature selection methods optimize the feature set based on some notion of performance. Cost-sensitive feature selection allows the cost of each feature to be incorporated into the feature selection process [2,4,10,20,34,35,38,52].…”
Section: Introduction
Confidence: 99%
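As a concrete illustration of incorporating per-feature costs into selection, the sketch below greedily picks features by relevance per unit cost under a budget. This is one simple heuristic, not the specific method of the surveyed paper or any of the works it cites; the mutual-information relevance measure, the budget formulation, and the function name are all assumptions.

```python
# A minimal cost-sensitive selection sketch: features are greedily chosen
# by relevance per unit cost until a budget is exhausted. The relevance
# measure (mutual information with the label, via scikit-learn), the budget
# formulation, and the function name are illustrative assumptions; costs
# are assumed strictly positive.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def cost_sensitive_select(X, y, costs, budget):
    relevance = mutual_info_classif(X, y)               # per-feature relevance
    ratio = relevance / np.asarray(costs, dtype=float)  # value per unit cost
    selected, spent = [], 0.0
    for i in np.argsort(-ratio):                        # best ratio first
        if spent + costs[i] <= budget:
            selected.append(i)
            spent += costs[i]
    return selected
```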