2022
DOI: 10.1016/j.eswa.2021.115845

Joint sparse principal component regression with robust property

Cited by 7 publications (3 citation statements)
References 26 publications
“…They propose to combine the previous projection learning models with linear regression into one objective function and develop one-stage methods. Typical examples include sparse principal component regression (SPCR) [31], joint SPCR (JSPCR) [32], and orthogonal autoencoder regression (OAR) [33]. To simultaneously take the local geometrical structure and discriminative information into account, Lai et al. [34] imposed the L2,1-norm on the projection and the LPP term to extract robust and discriminative features, proposing a generalised robust regression (GRR) method.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
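The key ingredient shared by these one-stage methods is the row-sparsity (L2,1) penalty on the projection matrix. As a minimal NumPy sketch (my own illustration, not code from any of the cited papers), here are the L2,1-norm and its proximal operator, which zeroes entire rows of a projection matrix and thereby performs feature selection:

```python
import numpy as np

def l21_norm(W):
    # L2,1-norm: sum of the Euclidean norms of the rows of W
    return np.sum(np.linalg.norm(W, axis=1))

def prox_l21(W, tau):
    # Proximal operator of tau * ||.||_{2,1}: row-wise soft thresholding.
    # Rows whose norm is <= tau are zeroed, giving a row-sparse projection.
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return W * scale
```

In a proximal-gradient solver for such joint objectives, each iteration would take a gradient step on the smooth regression/reconstruction terms and then apply `prox_l21` to enforce row sparsity.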
“…It involves applying robust principal component analysis (PCA) to the high-dimensional regressor data, followed by a robust regression method that regresses the response variables on the obtained scores [12-15]. Among robust regression methods, M-estimation is commonly used; it employs iteratively reweighted least squares to estimate the regression coefficients [16]. The weight assigned to each sample is determined from its residual in the previous regression step.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
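The two-stage recipe described in this excerpt (scores first, then M-estimation by iteratively reweighted least squares) can be sketched as follows. This is a generic Huber M-estimator assuming the score matrix has already been computed; it is an illustration, not the specific estimators of [12-16]:

```python
import numpy as np

def huber_weights(r, c=1.345):
    # Huber weight function: weight 1 inside the tuning constant c,
    # c/|r| outside, so large residuals are progressively downweighted
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def m_estimate(Z, y, n_iter=50, tol=1e-10):
    # M-estimation of y ~ Z beta via iteratively reweighted least squares;
    # Z would hold the (robust) principal component scores in robust PCR
    beta = np.linalg.lstsq(Z, y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - Z @ beta
        # robust scale estimate from the median absolute deviation (MAD)
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12
        w = huber_weights(r / s)
        # weighted least squares: solve (Z^T W Z) beta = Z^T W y
        beta_new = np.linalg.solve(Z.T @ (Z * w[:, None]), Z.T @ (w * y))
        if np.linalg.norm(beta_new - beta) < tol:
            return beta_new
        beta = beta_new
    return beta
```

Each pass refits a weighted least-squares problem with weights computed from the previous fit's residuals, exactly the loop the excerpt describes.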
“…In addition, Kawano proposed another method [26] based on the singular value decomposition of the loss function of PCA in SPCR. Qi et al. proposed JSPCR [27], an extension of SPCR that can be applied to outcomes containing outliers. Methods for using multi-block data related to SPCR have also been proposed [28-30].…”
Section: Introduction
Citation type: mentioning
Confidence: 99%