2003
DOI: 10.1002/cem.785
Partial least squares for discrimination

Abstract: Partial least squares (PLS) was not originally designed as a tool for statistical discrimination. In spite of this, applied scientists routinely use PLS for classification and there is substantial empirical evidence to suggest that it performs well in that role. The interesting question is: why can a procedure that is principally designed for overdetermined regression problems locate and emphasize group structure? Using PLS in this manner has heuristic support owing to the relationship between PLS and canonica…
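The abstract describes pressing a regression method, PLS, into service as a classifier. A minimal sketch of that workflow, assuming scikit-learn's PLSRegression and purely synthetic two-group data (nothing below comes from the paper itself; it only illustrates the dummy-coded-response idea):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Synthetic two-group data: 40 samples, 100 variables (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 100))
y = np.repeat([0, 1], 20)          # class labels as a dummy-coded response
X[y == 1, :5] += 1.0               # shift a few variables for group 1

# PLS-DA: regress the 0/1 response on X, then threshold the prediction.
pls = PLSRegression(n_components=2)
pls.fit(X, y)
y_pred = (pls.predict(X).ravel() > 0.5).astype(int)
print("training accuracy:", (y_pred == y).mean())
```

The threshold at 0.5 is the simplest possible assignment rule; in practice the cutoff and the number of latent variables are chosen by cross-validation.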

Cited by 2,423 publications (1,628 citation statements)
References 22 publications
“…The application of principal component analysis showed a sufficient separation between the groups in the first three dimensions, displaying the second and third components (Supplementary Figure S1A). In order to extrapolate the lists of mass signals causing the possible separation between the two classes (C57J vs C57N), an orthogonal partial least squares discriminant analysis (OPLS/O2PLS-DA) has been performed (Trygg, 2002; Trygg and Wold, 2002; Barker and Rayens, 2003; Bylesjö et al, 2006). The robustness of the model has been tested with cross-validation analysis of variance (P = 0.000275).…”
Section: General
confidence: 99%
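This excerpt pairs a PLS-based discriminant model with a cross-validated significance check. A rough illustration of the same idea, using plain PLS-DA and a label-permutation test as stand-ins for OPLS-DA and CV-ANOVA (scikit-learn has no OPLS implementation, and the data below are invented, not the cited metabolomics set):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import permutation_test_score

# Illustrative data standing in for two sample classes.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 200))
y = np.repeat([0, 1], 15)
X[y == 1, :10] += 0.8              # weak class-related signal

pls = PLSRegression(n_components=2)

# Permutation test: how often does a model fit to shuffled labels score as
# well as the real one? A small p-value suggests the separation is not a
# chance artefact (a rough analogue of the CV-ANOVA check in the excerpt).
score, perm_scores, p_value = permutation_test_score(
    pls, X, y, scoring="r2", cv=5, n_permutations=200, random_state=1
)
print(f"CV R2 = {score:.2f}, permutation p = {p_value:.3f}")
```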
“…cm1 is drawn from a NASA spacecraft instrument project, pc1 is from a flight … In order to investigate the performance of AKPLSC (the Gaussian kernel κ(x, y) = exp(−||x − y||²) is used here), we compare it with random undersampling (RUS) [4], AdaBoost [4], Partial Least Squares Classifier (PLSC) [7], APLSC [6], and KPLSC [8]. For each data set, we perform a 10 × 5-fold cross-validation.…”
Section: Results
confidence: 99%
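The comparison in this excerpt rests on a Gaussian kernel and repeated cross-validation. A small sketch of those two pieces, assuming scikit-learn, an arbitrary placeholder classifier, and synthetic data; the AKPLSC itself is not reimplemented, and "10 × 5-fold" is read here as 5-fold CV repeated 10 times:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

def gaussian_kernel(x, y):
    """Gaussian kernel k(x, y) = exp(-||x - y||^2), as written in the excerpt."""
    return np.exp(-np.sum((x - y) ** 2))

# Placeholder data and classifier; the cited study uses NASA defect data
# (cm1, pc1) and an asymmetric kernel PLS classifier instead.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 20))
y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)

cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=10, random_state=2)
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=cv)
print(f"mean accuracy over {len(scores)} folds: {scores.mean():.3f}")
```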
“…Linear Partial Least Squares (PLS) [7] is an effective linear transformation, which performs the regression on the subset of extracted latent variables. Kernel PLS [8] first performs a nonlinear mapping Φ : {x_i}_{i=1}^n ∈ R^N → Φ(x) ∈ F to project an input vector to a higher-dimensional feature space.…”
Section: Asymmetric Kernel Partial Least Squares Classifier for Softw…
confidence: 99%
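The excerpt contrasts linear PLS (regression on extracted latent variables) with kernel PLS, which maps inputs into a feature space F and works there through a kernel matrix rather than an explicit Φ(x). A compact sketch of that idea using the kernel trick, assuming an RBF kernel and scikit-learn's PLSRegression run on the centered Gram matrix; this is an assumed simplification, not the cited KPLS algorithm:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics.pairwise import rbf_kernel

# Toy regression data; sizes and the gamma value are arbitrary choices.
rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(100, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)

# Kernel trick: instead of computing Phi(x) explicitly, work with the
# Gram matrix K[i, j] = k(x_i, x_j), which encodes inner products in F.
K = rbf_kernel(X, X, gamma=0.5)

# Double-center the kernel matrix, then run linear PLS on it to obtain a
# kernel-PLS-like regression in the induced feature space.
Kc = K - K.mean(axis=0) - K.mean(axis=1, keepdims=True) + K.mean()
kpls = PLSRegression(n_components=5)
kpls.fit(Kc, y)
print("fit R^2 on training data:", kpls.score(Kc, y))
```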
“…Principal components analysis and partial least squares discriminant analysis were utilized as modeling methods for clustering and discrimination. 33 Random subsets cross-validation method and Q² scores were used to develop the models. The variable importance in the projection values were calculated to identify the most important molecular species for the clustering of specific groups.…”
Section: Discussion
confidence: 99%
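This last excerpt ranks variables by variable importance in projection (VIP) after a PLS-DA fit. scikit-learn does not expose VIP directly, so the sketch below computes it from fitted PLSRegression attributes using the standard VIP formula; the helper and the data are illustrative assumptions, not code from the cited study:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def vip_scores(pls):
    """VIP from a fitted PLSRegression with a single response.

    Standard formula: VIP_j = sqrt(p * sum_a ss_a * (w_ja / ||w_a||)^2 / sum_a ss_a),
    where ss_a is the y-variance explained by latent variable a.
    """
    t = pls.x_scores_          # (n_samples, n_components) scores
    w = pls.x_weights_         # (n_features, n_components) weights
    q = pls.y_loadings_        # (1, n_components) for one response
    p, _ = w.shape
    ss = np.sum(t ** 2, axis=0) * (q ** 2).ravel()   # variance explained per LV
    w_norm = w / np.linalg.norm(w, axis=0)
    return np.sqrt(p * (w_norm ** 2 @ ss) / ss.sum())

# Illustrative PLS-DA fit on synthetic data.
rng = np.random.default_rng(4)
X = rng.normal(size=(50, 30))
y = np.repeat([0, 1], 25)
X[y == 1, :3] += 1.0
pls = PLSRegression(n_components=2).fit(X, y)
print("top variables by VIP:", np.argsort(vip_scores(pls))[::-1][:5])
```

Variables with VIP above roughly 1 are commonly treated as the species driving the group separation, which is how such scores are typically read in metabolomics work like the excerpt above.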