2023
DOI: 10.1016/j.ins.2022.11.156
Unsupervised feature selection through combining graph learning and ℓ2,0-norm constraint

Cited by 30 publications (8 citation statements)
References 41 publications
“…Subsequently, numerous graph construction methods based on data correlation have been presented, including the L1 graph [27], low-rank representation (LRR) [28], local structure learning [29], and sparse subspace clustering (SSC) [30], to construct high-quality graphs. These graph construction methods have been integrated into FS models, yielding a large number of improved feature selection algorithms [31][32][33][34][35][36][37]. However, in the above-mentioned methods the processes of adaptive graph construction and FS are independent of each other, so the influence of graph construction on the FS process is limited.…”
Section: Introductionmentioning
confidence: 99%
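The statement above contrasts adaptive graph learning with the classical pipeline in which a fixed similarity graph is built first and feature selection runs on it afterwards. A minimal sketch of that classical two-stage pipeline, using a kNN similarity graph and the Laplacian score (the function names, synthetic data, and parameter values here are illustrative assumptions, not the cited paper's method):

```python
import numpy as np

def knn_graph(X, k=5):
    """Build a symmetric kNN similarity graph with Gaussian weights."""
    # Pairwise squared Euclidean distances.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros_like(d2)
    for i in range(len(X)):
        nn = np.argsort(d2[i])[1:k + 1]   # k nearest neighbors, skipping self
        W[i, nn] = np.exp(-d2[i, nn])
    return np.maximum(W, W.T)             # symmetrize

def laplacian_score(X, W):
    """Score each feature by how well it respects the graph structure
    (lower score = feature varies less across strongly connected samples)."""
    D = np.diag(W.sum(1))
    L = D - W
    ones = np.ones(len(X))
    scores = []
    for j in range(X.shape[1]):
        f = X[:, j]
        f = f - (f @ D @ ones) / (ones @ D @ ones)   # degree-weighted centering
        scores.append((f @ L @ f) / (f @ D @ f))
    return np.array(scores)

# Synthetic data: feature 0 separates two clusters, features 1-3 are noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))
X[:, 0] = np.repeat([0.0, 5.0], 30) + 0.1 * rng.normal(size=60)

W = knn_graph(X, k=5)
scores = laplacian_score(X, W)
print(scores.argmin())   # the cluster-aligned feature should score lowest
```

Because `W` is fixed before `laplacian_score` runs, errors in the graph propagate directly into the feature ranking — exactly the limitation the quoted passage attributes to decoupled graph construction and FS.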
“…It is intuitive to ask whether it is also effective to attack the features of the middle layers. Various scholars have devoted their efforts to tackling this question [39,40]. Zhou et al presented Transferable Adversarial Perturbations (TAP) to improve transferability by increasing the feature distances between the original image and its adversarial example in the intermediate layers [41].…”
Section: Related Workmentioning
confidence: 99%
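The core idea in the quoted passage — perturbing an input so that its intermediate-layer representation moves away from the clean input's — can be sketched with a toy one-layer feature extractor and FGSM-style sign-gradient steps. The weights, step sizes, and network here are illustrative assumptions, not TAP's actual architecture or loss:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(16, 8)) / np.sqrt(8)   # hypothetical first-layer weights

def features(x):
    """Intermediate-layer representation of input x."""
    return np.tanh(W1 @ x)

def feature_space_attack(x, eps=0.1, steps=10, alpha=0.02):
    """Maximize ||features(x_adv) - features(x)||^2 inside an
    L-infinity ball of radius eps (sign-gradient ascent)."""
    h_clean = features(x)
    # Small random start so the initial gradient is nonzero.
    x_adv = x + rng.uniform(-eps / 10, eps / 10, size=x.shape)
    for _ in range(steps):
        h = features(x_adv)
        # Analytic gradient of the squared feature distance w.r.t. x_adv:
        # d/dx ||tanh(W1 x) - h_clean||^2 = W1^T ((1 - h^2) * 2 (h - h_clean))
        grad = W1.T @ (2 * (h - h_clean) * (1 - h ** 2))
        x_adv = x_adv + alpha * np.sign(grad)        # ascend
        x_adv = np.clip(x_adv, x - eps, x + eps)     # project back into the ball
    return x_adv

x = rng.normal(size=8)
x_adv = feature_space_attack(x)
d = np.linalg.norm(features(x_adv) - features(x))
print(d)   # the intermediate features have been pushed apart
```

The intuition behind attacking intermediate layers, as the citing paper notes, is that mid-level features are more similar across architectures than final-layer logits, so perturbations crafted against them tend to transfer better.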
“…A graph-theoretic method based on a two-step procedure that combines filter and wrapper methods was proposed in [13] to classify microarray data. Beyond single-label learning, feature selection has also been applied in other contexts such as multi-label, multi-view, unsupervised, and label distribution learning [31], [40], [53], [55].…”
Section: Literaturementioning
confidence: 99%