2010
DOI: 10.5121/ijaia.2010.1405

Multi-Level Dimensionality Reduction Methods Using Feature Selection and Feature Extraction

Abstract: This paper presents a novel feature selection method called the Feature Quality (FQ) measure, based on the quality measure of individual features. We also propose novel combinations of two-level and multi-level dimensionality reduction methods, which combine feature selection methods such as mutual correlation and the FQ measure with feature extraction methods such as PCA (Principal Component Analysis) and LPP (Locality Preserving Projection). These multi-level dimensionality reduction methods integrate feature selection and feature e…
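Below is a minimal, hypothetical Python sketch of the general two-level idea the abstract describes: a filter-style selection step based on mutual correlation, followed by PCA for feature extraction. It is not the paper's FQ measure or its LPP variant; the correlation criterion, the keep_ratio parameter, and the synthetic data are illustrative assumptions.

```python
# Sketch of a two-level dimensionality reduction pipeline:
# level 1 drops features by a simple filter criterion (mean absolute
# correlation with the other features), level 2 projects the survivors
# with PCA. Parameters and data are illustrative only.
import numpy as np
from sklearn.decomposition import PCA

def correlation_filter(X, keep_ratio=0.5):
    """Rank features by mean absolute correlation with all other
    features and keep the least-redundant fraction."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    np.fill_diagonal(corr, 0.0)
    redundancy = corr.mean(axis=1)            # high = strongly correlated with others
    n_keep = max(1, int(keep_ratio * X.shape[1]))
    return np.sort(np.argsort(redundancy)[:n_keep])

def two_level_reduction(X, keep_ratio=0.5, n_components=5):
    idx = correlation_filter(X, keep_ratio)   # level 1: feature selection
    X_sel = X[:, idx]
    pca = PCA(n_components=min(n_components, X_sel.shape[1]))
    X_red = pca.fit_transform(X_sel)          # level 2: feature extraction
    return X_red, idx, pca

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 40))            # synthetic data, 40 features
    X_red, idx, _ = two_level_reduction(X)
    print(X_red.shape)                        # (200, 5)
```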

Cited by 15 publications (9 citation statements)
References 26 publications

Citation statements (ordered by relevance):
“…Filter methods have a low computational cost and are independent of the learning method [6,48,49]. However, they lack robustness against relationships among elements and element redundancy [48,49], and it is unclear how to choose the cut-off point for rankings to determine only dominant features [6]. Generally, wrapper techniques outperform filter techniques [6,48,49] as they consider the feature dependencies and their collective contribution to model generation [6].…”
Section: Overview Of Feature Selection (Fs), Extraction (Fx) and Combi…
Citation type: mentioning; confidence: 99%
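To make the cut-off issue in the statement above concrete, here is a small hypothetical sketch (not taken from the cited papers) of a filter-style ranking: features are scored independently of any classifier, and the number of features to keep, k, is a free parameter with no principled default.

```python
# Filter-style selection: score each feature against the label
# independently of any classifier, then keep the top k. The choice of k
# is the arbitrary cut-off the citing paper refers to. Data are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif

X, y = make_classification(n_samples=300, n_features=30,
                           n_informative=5, random_state=0)

for k in (5, 10, 20):                          # cut-off is a free parameter
    selector = SelectKBest(mutual_info_classif, k=k).fit(X, y)
    print(k, np.flatnonzero(selector.get_support()))
```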
“…The advantage of the wrapper methods is that they work together with the specific classification algorithm and account for the synergy of the joint usage of selected features. The disadvantages of the wrapper methods are the higher risk of overtraining and long time required to calculate classification accuracy [34].…”
Section: Feature Selection
Citation type: mentioning; confidence: 99%
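For contrast, a hedged sketch of a wrapper-style search (again illustrative, not the method of the cited work): greedy forward selection scored by the cross-validated accuracy of the wrapped classifier. The repeated model fitting is where the computational cost comes from, and tuning the subset against CV accuracy is the source of the overfitting risk the statement mentions.

```python
# Wrapper-style greedy forward selection: each candidate subset is scored
# by cross-validated accuracy of the classifier it wraps.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=20,
                           n_informative=4, random_state=0)
clf = LogisticRegression(max_iter=1000)

selected, remaining, best_score = [], list(range(X.shape[1])), 0.0
while remaining:
    scores = {f: cross_val_score(clf, X[:, selected + [f]], y, cv=5).mean()
              for f in remaining}
    f_best, s_best = max(scores.items(), key=lambda kv: kv[1])
    if s_best <= best_score:                   # stop when accuracy stops improving
        break
    selected.append(f_best)
    remaining.remove(f_best)
    best_score = s_best

print("selected features:", selected, "CV accuracy: %.3f" % best_score)
```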
“…According to Pradnya and Manisha, 90% of the world's data is unstructured (textual data), and there is a need for intelligent text analysis on these data [1]. The major challenges of intelligent text-based systems are the accuracy of the system and the high dimensionality of the feature space [2]. It is therefore important to use a feature selection model to address these challenges by reducing the high dimensionality of the data for an effective text-based system.…”
Section: Introduction
Citation type: mentioning; confidence: 99%