Linear dimensionality reduction based on Hybrid structure preserving projections
2016 · DOI: 10.1016/j.neucom.2015.07.011

Cited by 26 publications (18 citation statements) · References 47 publications
“…The KLDA algorithm maps the input vectors into a higher-dimensional feature space F via a nonlinear mapping function φ, and then executes linear discriminant analysis in that high-dimensional feature space [40]. For further understanding, the KLDA algorithm is described in detail in the S1 File, where two references [41, 42] are cited.…”
Section: Methods
confidence: 99%
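As a rough illustration of the two-step procedure this excerpt describes, the sketch below implements a minimal two-class kernel Fisher discriminant in Python: the kernel matrix stands in for the implicit mapping φ into F, and the discriminant direction is obtained by solving a regularized generalized eigenproblem over expansion coefficients. The RBF kernel, the `gamma` and `reg` parameters, and the function names are illustrative assumptions, not the cited papers' implementation.

```python
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(A, B, gamma=1.0):
    # k(a, b) = exp(-gamma * ||a - b||^2), computed pairwise
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def kernel_fda(X, y, gamma=1.0, reg=1e-6):
    """Two-class kernel Fisher discriminant: LDA carried out in the
    feature space induced by the kernel, expressed via expansion
    coefficients alpha over the training points."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    means, N = [], np.zeros((n, n))
    for c in np.unique(y):
        Kc = K[:, y == c]                      # kernel columns of class c
        nc = Kc.shape[1]
        means.append(Kc.mean(axis=1))          # class mean in kernel form
        N += Kc @ (np.eye(nc) - np.ones((nc, nc)) / nc) @ Kc.T
    d = means[0] - means[1]
    M = np.outer(d, d)                          # between-class scatter
    # Leading generalized eigenvector of M alpha = lambda (N + reg I) alpha;
    # eigh returns eigenvalues in ascending order, so the last column wins.
    _, V = eigh(M, N + reg * np.eye(n))
    alpha = V[:, -1]
    return alpha   # project a new point x via rbf_kernel(x[None], X) @ alpha
```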
“…As this lies outside the focus of this paper, we do not describe the derivation and computation of the matrix W* in detail. According to [31, 32], for a multi-class pattern classification problem with C classes, the orthonormal columns of W* must satisfy Equation (13), which is a generalized eigenvalue problem.…”
Section: Dimension Reduction Methods and Classifier Algorithm
confidence: 99%
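Although the excerpt defers Equation (13) to the cited references, the generalized eigenvalue problem behind a C-class LDA projection is standard: W* collects the leading generalized eigenvectors of the between-class and within-class scatter matrices. The sketch below is a minimal Python version under that standard formulation; the small ridge term added to S_w is an assumption to keep the problem well-posed, and the exact orthonormality constraint on the columns of W* follows [31, 32] rather than this code.

```python
import numpy as np
from scipy.linalg import eigh

def lda_projection(X, y, n_components):
    """W* as the top generalized eigenvectors of S_b w = lambda * S_w w.
    For a C-class problem, at most C - 1 components are informative."""
    d = X.shape[1]
    mu = X.mean(axis=0)
    Sw, Sb = np.zeros((d, d)), np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)          # within-class scatter
        diff = (mc - mu)[:, None]
        Sb += Xc.shape[0] * (diff @ diff.T)    # between-class scatter
    # eigh solves Sb w = lambda Sw w; eigenvalues are returned ascending.
    # The ridge term (an assumption) keeps Sw invertible.
    _, V = eigh(Sb, Sw + 1e-8 * np.eye(d))
    return V[:, ::-1][:, :n_components]        # columns of W*, largest first
```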
“…(2) Regression problem. By taking the grade as the response variable, SGP is recast as predicting scores from student or course features, using methods such as linear regression [5, 11, 12], neural networks [13–15], and random forests [9]. (3) Matrix completion.…”
Section: Introduction
confidence: 99%
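As a toy illustration of the regression formulation mentioned under (2), the snippet below fits a linear model mapping student/course features to grades. The feature layout and values are purely hypothetical, and scikit-learn's LinearRegression stands in for any of the regression models the excerpt cites.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical rows: [past GPA, attendance rate, course difficulty, credits]
X = np.array([[3.2, 0.90, 2.0, 3],
              [2.1, 0.70, 3.5, 4],
              [3.8, 0.95, 1.5, 3],
              [2.9, 0.80, 2.5, 3]])
y = np.array([85.0, 62.0, 91.0, 74.0])   # observed grades (response variable)

model = LinearRegression().fit(X, y)
print(model.predict(np.array([[3.0, 0.85, 2.5, 3]])))  # predicted grade
```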