2015
DOI: 10.1016/j.neunet.2015.06.001
Locality preserving score for joint feature weights learning

Cited by 10 publications (4 citation statements) · References 26 publications
“…SPFW is similar to our latest work [28] because both use the same feature search strategy, which learns feature weights jointly in an unsupervised way. However, their feature measurements are different.…”
Section: Basic Idea and Formulation
Confidence: 99%
“…However, their feature measurements are different. SPFW introduces sparse representation into feature selection; [28] combines feature selection and adaptive neighbors preservation into a single framework.…”
Section: Basic Idea and Formulation
Confidence: 99%
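The citation statements above contrast feature-measurement criteria built on locality preservation. A minimal sketch of the underlying idea, assuming a Laplacian-score-style criterion (the function and parameter names here are illustrative, not the cited authors' implementation):

```python
import numpy as np

def locality_scores(X, k=3, t=1.0):
    """Illustrative Laplacian-score-style criterion: rate each feature
    by how well it preserves the local neighborhood structure of the
    data (lower score = better locality preservation)."""
    n, d = X.shape
    # pairwise squared Euclidean distances
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # k-nearest-neighbor graph with heat-kernel edge weights
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(D2[i])[1:k + 1]   # skip self at index 0
        W[i, nbrs] = np.exp(-D2[i, nbrs] / t)
    W = np.maximum(W, W.T)                  # symmetrize the graph
    deg = W.sum(axis=1)
    Dg = np.diag(deg)
    L = Dg - W                              # unnormalized graph Laplacian
    scores = np.empty(d)
    for r in range(d):
        f = X[:, r]
        # remove the degree-weighted mean, as in the Laplacian score
        f = f - (f @ deg) / deg.sum()
        den = f @ Dg @ f
        scores[r] = (f @ L @ f) / den if den > 0 else np.inf
    return scores
```

Joint feature-weight learning, as in the cited work [28], optimizes all feature weights together under such a locality criterion rather than scoring each feature independently; this sketch shows only the per-feature score.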
“…Unsupervised feature selection may fail to extract the most discriminative features, which can yield worse performance. Semi-supervised feature selection focuses on maximizing data effectiveness by using labeled and unlabeled data together [7]. In this case, the amount of unlabeled data is much larger than that of labeled data.…”
Section: Introduction
Confidence: 99%
“…We emphasize that though we have not treated trained networks directly, our techniques apply to trained networks that are close to GPs. As shown in [9,10,35], randomly initialized infinite-width NNs are GPs, and further, the GP property persists under appropriate training [9,28,29,35,36]. Such trained networks at finite width should be effectively described using our techniques, which only require being close to a GP.…”
Section: Related Work
Confidence: 99%
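The neural-network/GP correspondence referenced in the quote above can be checked empirically: the outputs of many wide, randomly initialized networks at a fixed input concentrate toward a zero-mean Gaussian. A minimal sketch, assuming a one-hidden-layer ReLU network with 1/sqrt(width) readout scaling (illustrative only, not the cited authors' code):

```python
import numpy as np

def random_relu_net_outputs(x, width=2048, n_nets=400, seed=0):
    """Sample the scalar output of many independently initialized
    one-hidden-layer ReLU networks at a fixed input x.  As width grows,
    this output distribution approaches a zero-mean Gaussian (the NNGP)."""
    rng = np.random.default_rng(seed)
    d = x.shape[0]
    outs = np.empty(n_nets)
    for i in range(n_nets):
        W = rng.normal(size=(width, d)) / np.sqrt(d)   # input-layer weights
        v = rng.normal(size=width) / np.sqrt(width)    # readout weights
        outs[i] = v @ np.maximum(W @ x, 0.0)           # ReLU hidden layer
    return outs
```

For this architecture the limiting Gaussian's variance is given in closed form by the arc-cosine kernel; trained finite-width networks that remain close to this GP are the regime the quoted passage refers to.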