1990
DOI: 10.1111/j.2517-6161.1990.tb01786.x
Continuum Regression: Cross-Validated Sequentially Constructed Prediction Embracing Ordinary Least Squares, Partial Least Squares and Principal Components Regression

Abstract: The paper addresses the evergreen problem of construction of regressors for use in least squares multiple regression. In the context of a general sequential procedure for doing this, it is shown that, with a particular objective criterion for the construction, the procedures of ordinary least squares and principal components regression occupy the opposite ends of a continuous spectrum, with partial least squares lying in between. There are two adjustable ‘parameters’ controlling the procedure: ‘alpha’,…
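The spectrum described in the abstract can be made concrete with a small sketch. The criterion below follows the usual statement of the Stone–Brooks objective, T = cov(Xc, y)^2 * var(Xc)^(alpha/(1-alpha) - 1), maximised over unit directions c; treat it as an illustration under that assumption, with hypothetical names and toy data rather than a transcription from the paper.

    import numpy as np

    def cr_criterion(c, X, y, alpha):
        """Continuum regression criterion T_alpha for a candidate direction c.

        Assumes X and y are centred and 0 <= alpha < 1. alpha = 0 reduces to
        cov^2/var (squared correlation, i.e. the OLS direction), alpha = 1/2
        to cov^2 (PLS), and alpha -> 1 is dominated by var(Xc) (PCR).
        """
        c = np.asarray(c, dtype=float)
        c = c / np.linalg.norm(c)          # a direction is defined up to scale
        t = X @ c                          # candidate regressor (score vector)
        n = len(y)
        cov2 = (t @ y / n) ** 2            # squared covariance with the response
        var = (t @ t) / n                  # variance of the score
        return cov2 * var ** (alpha / (1.0 - alpha) - 1.0)

    # Crude demonstration: the maximising direction drifts from the OLS
    # direction (alpha = 0) towards the high-variance direction (alpha -> 1).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 4))
    X[:, 1] *= 3.0                         # give one direction dominant variance
    X -= X.mean(axis=0)
    y = X[:, 0] + 0.2 * rng.standard_normal(100)
    y -= y.mean()
    for alpha in (0.0, 0.5, 0.9):
        cands = rng.standard_normal((5000, 4))
        best = max(cands, key=lambda c: cr_criterion(c, X, y, alpha))
        print(alpha, np.round(best / np.linalg.norm(best), 2))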

Cited by 371 publications (211 citation statements). References 20 publications.
“…Because of this, it can be expected that, when taking the product of R_X^2 and R_Y^2, no weightings of R_X^2 and R_Y^2 are needed to keep either R_X^2 or R_Y^2 from becoming too small. This is not to say that it is impossible to implement such weights: indeed, in a similar way as in continuum regression (Stone and Brooks 1990), we could take different powers of the terms in the product. So a general formulation of the criterion could be (see De Jong and Kiers 1992) …”
Section: Power Regression
confidence: 97%
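The generalised criterion itself is elided in the quotation above. Purely to illustrate the idea of weighting the two fits by powers, here is a hypothetical scoring function in that spirit; the name, the gamma parameter, and the variance-fraction definitions are assumptions made for the sketch, not De Jong and Kiers' actual formula.

    import numpy as np

    def weighted_product_criterion(w, X, Y, gamma=0.5):
        """Score a candidate component t = Xw by (R_X^2)^gamma * (R_Y^2)^(1-gamma).

        R_X^2 and R_Y^2 are taken here as the fractions of total variance in X
        and Y captured by the unit score t; gamma = 0.5 recovers the unweighted
        product discussed in the quotation, and other values mimic the
        continuum-regression trick of raising the terms to different powers.
        """
        t = X @ np.asarray(w, dtype=float)
        t = t / np.linalg.norm(t)
        r2_x = np.sum((X.T @ t) ** 2) / np.sum(X ** 2)   # variance of X along t
        r2_y = np.sum((Y.T @ t) ** 2) / np.sum(Y ** 2)   # variance of Y along t
        return r2_x ** gamma * r2_y ** (1.0 - gamma)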
“…For this reason, various alternatives to regression, sometimes called biased regression techniques, have been proposed. The best known of these are ridge regression (Hoerl and Kennard 1970), partial least squares (PLS; e.g., see Wold et al 1984; Martens and Naes 1989) and principal component regression (PCR; see Coxe 1986; Martens and Naes 1989), but several other methods have also been developed (e.g., ILS by Frank 1987; continuum regression by Stone and Brooks 1990, see also De Jong et al 2001 and Stone 1994; the Curds and Whey procedure by Breiman and Friedman 1997; and various recent, more specialized procedures, e.g. see Esposito Vinzi et al 2001).…”
Section: Introduction
confidence: 99%
“…The PLS multivariate regression (PLS2) can be seen either from an algorithmic point of view (NIPALS) or connected to classical multivariate theory, following Wold (Wold et al 1983) and Stone and Brooks (Höskuldsson 1988, Stone and Brooks 1990). These works show how parameter estimation in PLS2, and consequently the extraction of the A relevant components, can be solved through classical eigen-problems: by a single SVD of the cross-product matrix X'Y, or by repeating A times the SVD of the cross-product X_{a-1}'Y of the deflated matrix X_{a-1}, with a = 1, …, A.…”
Section: PLS Regression for L-Structured Data (L-PLSR)
confidence: 99%
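The deflation scheme in this quotation translates directly into a few lines of linear algebra. A minimal sketch, assuming centred X and Y and using NumPy's SVD; the function name is illustrative:

    import numpy as np

    def pls2_weights(X, Y, A):
        """Extract A PLS2 weight vectors by repeated SVDs, as the quotation
        describes: at step a, take the dominant left singular vector of the
        cross-product X_{a-1}'Y of the current deflated X-block with Y.
        """
        Xa = X.copy()
        W = []
        for _ in range(A):
            u, s, vt = np.linalg.svd(Xa.T @ Y, full_matrices=False)
            w = u[:, 0]                    # X-weight for this component
            t = Xa @ w                     # score vector
            p = Xa.T @ t / (t @ t)         # X-loading
            Xa = Xa - np.outer(t, p)       # deflate before the next SVD
            W.append(w)
        return np.column_stack(W)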
“…[34,45] In fact, PLSR can be viewed as a compromise midway between PCR and MLR.[49] In determining the decomposition of R (and consequently removing unwanted random variance), PCR is not influenced by knowledge of the estimated property in the calibration set, c. Only the variance in R is employed to determine the latent variables. Conversely, MLR does not factor R prior to regression; all variance correlated with c is employed for estimation.…”
Section: Partial Least-Squares Regression
confidence: 99%
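The contrast this quotation draws between PCR and MLR is easy to see in code. A minimal sketch, assuming centred data and reusing the quoted symbols R (measurement matrix) and c (property vector); the function names are illustrative:

    import numpy as np

    def pcr_coefficients(R, c, k):
        """PCR: factor R by an SVD that never sees c (only the variance in R
        determines the latent variables), then regress c on the first k scores.
        """
        U, s, Vt = np.linalg.svd(R, full_matrices=False)
        T = U[:, :k] * s[:k]                       # principal-component scores
        q, *_ = np.linalg.lstsq(T, c, rcond=None)  # least squares on the scores
        return Vt[:k].T @ q                        # map back to R-space

    def mlr_coefficients(R, c):
        """MLR: no factoring of R; every direction correlated with c is used."""
        b, *_ = np.linalg.lstsq(R, c, rcond=None)
        return b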