2018
DOI: 10.1109/access.2018.2878855
Unsupervised Feature Selection With Ordinal Preserving Self-Representation

Cited by 9 publications (6 citation statements)
References 40 publications
“…Several methods have been proposed using the self-representation based reconstruction error function [2], [11], [12], [21], [22], [24]-[26], [38], [39], where each feature is reconstructed by a linear combination of the other features according to…”
Section: B. Reconstruction Based Methods
confidence: 99%
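For context, the reconstruction error these methods share can be written compactly. A minimal LaTeX sketch, assuming the common regularized self-representation form with an \ell_{2,1} row-sparsity penalty (the regularizer and the feature-ranking rule are standard choices in this literature, not details taken from the quoted papers):

\min_{W \in \mathbb{R}^{d \times d}} \; \|X - XW\|_F^2 + \lambda \|W\|_{2,1}, \qquad \|W\|_{2,1} = \sum_{i=1}^{d} \|w^{i}\|_2 ,

where each column of X \in \mathbb{R}^{n \times d} (a feature) is reconstructed as a linear combination of the other features, and features are then ranked by the row norms \|w^{i}\|_2 of the learned coefficient matrix W.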
“…On the one hand, the original data should be approximated only from representative samples and relevant features to achieve more faithful and interpretable results. However, the self-representation based methods select relevant features while still using all the candidate samples [11], [21], [22], [24]-[26], and the matrix factorization and dictionary learning methods estimate mixed-signed, less interpretable coefficients or bases [14], [15], [18], [20], [27].…”
Section: Introduction
confidence: 99%
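The contrast drawn here can be made concrete with a hedged sketch, assuming only the generic factorization form rather than any one cited method:

X \approx UV, \qquad U \in \mathbb{R}^{n \times k}, \; V \in \mathbb{R}^{k \times d},

where the learned basis U and coefficients V are unconstrained and may carry mixed signs, so neither factor corresponds to an original feature or sample; self-representation, X \approx XW, instead ties every reconstruction coefficient to an original feature, which is what makes its results easier to interpret.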
“…The method proposed by Fan et al. [34] can select more informative features and simultaneously learn the structural information of the data points. Dai et al. [35] proposed a method in which each feature is represented by a linear combination of the other features while maintaining the local geometrical structure and the ordinal locality of the original data.…”
Section: A. Unsupervised Feature Selection Methods
confidence: 99%
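A plausible shape for such an objective, offered as a hedged sketch rather than the paper's exact formulation (the trade-off weights \lambda, \gamma and the construction of the Laplacian are assumptions for illustration):

\min_{W}\; \|X - XW\|_F^2 + \lambda \|W\|_{2,1} + \gamma\, \mathrm{Tr}\!\big((XW)^{\top} L \,(XW)\big),

where L is a graph Laplacian built from an affinity that preserves the neighborhood ranking (ordinal locality) of the data points, so the reconstructed data XW keeps the local geometry of X.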
“…This method converts Euclidean distances into conditional probabilities, thereby describing the high-dimensional data through a normal distribution and expressing the similarity between points [44], as shown in (13). Defining the low-dimensional similarity analogously gives the result shown in (14). Finally, the SNE algorithm also applies the Kullback-Leibler divergence (KLD) to express the degree of similarity between the two distributions.…”
Section: Stochastic Neighbor Embedding
confidence: 99%
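For readers without the cited paper at hand, equations (13) and (14) most likely correspond to the standard SNE definitions of Hinton and Roweis; this is an inference from the surrounding description, not a copy of the citing paper's own equations:

p_{j|i} = \frac{\exp\!\big(-\|x_i - x_j\|^2 / 2\sigma_i^2\big)}{\sum_{k \neq i} \exp\!\big(-\|x_i - x_k\|^2 / 2\sigma_i^2\big)} \quad (13), \qquad q_{j|i} = \frac{\exp\!\big(-\|y_i - y_j\|^2\big)}{\sum_{k \neq i} \exp\!\big(-\|y_i - y_k\|^2\big)} \quad (14),

and the low-dimensional embedding y is found by minimizing the KL divergence C = \sum_i \sum_j p_{j|i} \log\!\big(p_{j|i} / q_{j|i}\big).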
“…In order to solve these problems, the selection of features becomes more and more critical. Using typical filter methods such as ReliefF, symmetrical uncertainty (SU), and the fast correlation-based filter (FCBF) for feature selection undoubtedly reduces computational complexity and database size, since the algorithm does not change the original features [14], [15]. Compared with the other category, feature extraction projects important features to facilitate visual observation and can reorganize subspaces while retaining the data structure of the original space.…”
Section: Introduction
confidence: 99%
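As an illustration of the filter-style scoring these methods rely on, below is a minimal Python sketch of symmetrical uncertainty for discrete features. The function names and toy data are hypothetical; ReliefF and FCBF build further selection logic on top of scores like this one.

import numpy as np
from collections import Counter

def entropy(values):
    # Shannon entropy (in bits) of a sequence of discrete symbols.
    counts = np.array(list(Counter(values).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def mutual_information(x, y):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), all terms estimated from counts.
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

def symmetrical_uncertainty(x, y):
    # SU(X,Y) = 2 * I(X;Y) / (H(X) + H(Y)), normalized to [0, 1].
    hx, hy = entropy(x), entropy(y)
    return 0.0 if hx + hy == 0 else 2.0 * mutual_information(x, y) / (hx + hy)

# Hypothetical toy data: score two discrete features against a class label.
label  = [0, 0, 1, 1, 1, 0]
feat_a = [0, 0, 1, 1, 1, 0]   # mirrors the label exactly -> SU = 1.0
feat_b = [0, 1, 0, 1, 0, 1]   # nearly independent of it  -> SU close to 0
print(symmetrical_uncertainty(feat_a, label))   # 1.0
print(symmetrical_uncertainty(feat_b, label))   # ~0.08

A filter method would compute this score for every feature against the class label and keep the top-ranked ones, leaving the features themselves unmodified.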