2012
DOI: 10.1109/tnnls.2012.2208269

A Discrimination Analysis for Unsupervised Feature Selection via Optic Diffraction Principle

Abstract: This paper proposes an unsupervised discrimination analysis for feature selection based on a property of the Fourier transform of the probability density distribution. Each feature is evaluated on the basis of a simple observation motivated by the concept of optical diffraction, which is invariant under feature scaling. The time complexity is O(mn), where m is the number of features and n is the number of instances, when the method is applied directly to the given data. This approach is also extended to deal with data orientation…
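For a concrete feel for the idea, the sketch below ranks features by a simple criterion built on the empirical characteristic function (the Fourier transform of the empirical density) of each standardized feature. It is an illustrative approximation, not the paper's actual diffraction-based measure: the averaged-magnitude score and the frequency grid are assumptions, but the per-feature evaluation keeps the O(mn) cost and the scale invariance noted in the abstract.

```python
# Minimal sketch (NOT the paper's exact criterion): score each feature by the
# average magnitude of its empirical characteristic function, i.e. the Fourier
# transform of the empirical density. Standardizing each feature first makes
# the score invariant under feature scaling. Cost is O(mn) for m features and
# n instances, since the frequency grid has fixed size.
import numpy as np

def characteristic_function_scores(X, freqs=np.linspace(0.5, 3.0, 6)):
    """X: (n, m) data matrix. Returns one score per feature (higher = more structure)."""
    X = np.asarray(X, dtype=float)
    n, m = X.shape
    scores = np.empty(m)
    for j in range(m):
        x = X[:, j]
        x = (x - x.mean()) / (x.std() + 1e-12)          # scale invariance
        # empirical characteristic function phi(t) = mean(exp(i * t * x))
        phi = np.abs(np.exp(1j * np.outer(freqs, x)).mean(axis=1))
        # |phi| decays quickly for a single Gaussian-like feature but more
        # slowly for features with well-separated clusters, so averaging its
        # magnitude gives a rough discrimination score.
        scores[j] = phi.mean()
    return scores

# usage: rank features and keep the top k
# idx = np.argsort(characteristic_function_scores(X))[::-1][:k]
```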

Cited by 25 publications (8 citation statements); references 26 publications.
“…Eight "noisy" features [sampled from N(0, 1)] are then appended to this data, resulting in 10-D patterns. 800 data points are generated, and a set of outliers uniformly sampled from [−10, 30]^10 is added to the data set. Various percentages of outliers are added to the main data sets to test the performance of the algorithm on outlier detection.…”
Section: A. Synthetic Data
confidence: 99%
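The quoted evaluation setup can be reproduced with a few lines of NumPy. The excerpt does not say how the two informative dimensions are generated, so the two-cluster structure below is a placeholder assumption; the noise dimensions, sample size, and outlier range follow the description above, and the outlier fraction is illustrative.

```python
# Sketch of the synthetic setup described above (informative structure assumed).
import numpy as np

rng = np.random.default_rng(0)
n, n_noise, outlier_frac = 800, 8, 0.05            # outlier fraction is illustrative

# two informative dimensions: placeholder two-cluster structure (assumption)
centers = rng.choice([-3.0, 3.0], size=(n, 2))
informative = centers + rng.normal(size=(n, 2))

# eight "noisy" features sampled from N(0, 1), appended -> 10-D patterns
noise = rng.normal(size=(n, n_noise))
X = np.hstack([informative, noise])

# outliers drawn uniformly from [-10, 30]^10 and added to the data set
n_out = int(outlier_frac * n)
outliers = rng.uniform(-10, 30, size=(n_out, 10))
X = np.vstack([X, outliers])
```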
“…Existing feature selection algorithms can be categorized as supervised feature selection (on data with full class labels) [5]-[9], unsupervised feature selection (on data without class labels) [10]-[15], and semisupervised feature selection (on data with partial labels) [14], [16], [17]. Feature selection in the unsupervised context is considered more difficult than the other two cases, since there is no target information available for training.…”
Section: Introduction
confidence: 99%
“…The goal of supervised feature selection is to find the most discriminative features that can distinguish different classes. Thus, discriminant analysis plays an important role in supervised feature selection [3], [20], [21]. The Fisher Score algorithm [10] is a widely applied filter-type feature selection algorithm based on linear discriminant analysis (LDA).…”
Section: Introduction
confidence: 99%
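The Fisher Score referenced in this excerpt has a standard per-feature form: the ratio of between-class to within-class variance. The sketch below shows one common formulation of that filter criterion; it is included only to illustrate the supervised counterpart to the unsupervised criterion of this paper, not the cited authors' exact implementation.

```python
# One common per-feature formulation of the Fisher Score (supervised filter):
# between-class variance divided by within-class variance for each feature.
import numpy as np

def fisher_scores(X, y):
    """X: (n, m) data, y: (n,) class labels. Returns one score per feature."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    mu = X.mean(axis=0)                       # overall per-feature mean
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        nc = Xc.shape[0]
        num += nc * (Xc.mean(axis=0) - mu) ** 2   # between-class scatter
        den += nc * Xc.var(axis=0)                # within-class scatter
    return num / (den + 1e-12)                # larger = more discriminative
```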
“…Mitra et al. (2004) proposed an FS technique based on Feature Similarity (FSFS). Padungweang et al. (2012) presented a new approach for FS using the optical diffraction principle. Recently, a non-parametric Bayesian error minimization scheme for FS was proposed by Yang and Hu (2012).…”
Section: Introduction
confidence: 99%