2006
DOI: 10.1002/cem.1025
Impartial graphical comparison of multivariate calibration methods and the harmony/parsimony tradeoff

Abstract: For multivariate calibration with the relationship y = Xb, it is often necessary to determine the degrees of freedom for parsimony consideration and for the error measure root mean square error of calibration (RMSEC). This paper shows that the model fitting degrees of freedom can be estimated by an effective rank (ER) measure, and that the more parsimonious model has the smallest ER. This paper also shows that when such a measure is used on the x-axis, simultaneous graphing of model er…
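The abstract's central idea, plotting a calibration error measure against a degrees-of-freedom measure so that accuracy and parsimony can be judged on one graph, can be sketched in a few lines. The ER definition itself is not given in the truncated abstract, so the sketch below uses the trace of the ridge-regression hat matrix purely as a stand-in degrees-of-freedom measure; the data and the tuning grid are invented for illustration.

```python
# Hypothetical sketch: plot RMSEC against a degrees-of-freedom measure for a family
# of ridge regression models. The trace of the hat matrix stands in for the paper's
# effective rank (ER); the paper's own ER definition may differ.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 20))            # calibration data (samples x variables)
b_true = rng.standard_normal(20)
y = X @ b_true + 0.1 * rng.standard_normal(40)

U, s, Vt = np.linalg.svd(X, full_matrices=False)

results = []
for lam in np.logspace(-3, 2, 30):           # ridge tuning parameters
    f = s / (s**2 + lam**2)                  # ridge filter factors
    b = Vt.T @ (f * (U.T @ y))               # ridge regression vector
    resid = y - X @ b
    rmsec = np.sqrt(np.mean(resid**2))       # root mean square error of calibration
    df = np.sum(s**2 / (s**2 + lam**2))      # trace of hat matrix ~ fitting degrees of freedom
    results.append((df, rmsec))

# Plotting RMSEC (y-axis) against the degrees-of-freedom measure (x-axis) gives the
# kind of error/parsimony graph the abstract describes; models toward the lower left
# are both accurate and parsimonious.
for df, rmsec in results:
    print(f"df = {df:5.2f}   RMSEC = {rmsec:.4f}")
```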

Cited by 25 publications (19 citation statements). References 49 publications.
“…The procedure based on sum of ranking differences (SRD) agrees well with multi-criteria decision making as it provides very similar rankings for models if the performance merits are used for SRD-based comparison (particularly when considering good models, as in Case study 1).…”
Section: Results (mentioning)
confidence: 87%
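For readers unfamiliar with sum of ranking differences (SRD), the ranking mechanics can be sketched briefly. The merit values, the choice of models as columns and merits as rows, and the use of the row-wise average as the reference column are all illustrative assumptions, not taken from the cited study; only the rank-difference computation itself is the SRD procedure.

```python
# Minimal SRD sketch: rank each model's column of merit values, rank the reference
# column the same way, and sum the absolute rank differences per model.
import numpy as np
from scipy.stats import rankdata

# hypothetical merit matrix: rows = performance merits (smaller is better),
# columns = candidate models A..D
merits = np.array([
    [0.12, 0.15, 0.11, 0.20],   # RMSEC
    [0.18, 0.17, 0.16, 0.25],   # RMSECV
    [2.10, 1.40, 3.00, 1.10],   # ||b|| (regression vector 2-norm)
    [0.03, 0.05, 0.02, 0.08],   # |bias|
])

reference = merits.mean(axis=1)               # row-wise average as the consensus column

def srd(column, reference):
    """Sum of absolute rank differences between a column and the reference column."""
    return np.abs(rankdata(column) - rankdata(reference)).sum()

scores = [srd(merits[:, j], reference) for j in range(merits.shape[1])]
for name, score in zip("ABCD", scores):
    print(f"model {name}: SRD = {score:g}")   # smaller SRD = closer to the consensus ranking
```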
“…A biased model provides less variance and vice versa. However, harmonious models are not necessarily parsimonious [1]. The scope of the methodology has recently been extended with the idea of sum of ranking differences (SRD) for partial least squares and ridge regression models [2].…”
(mentioning)
confidence: 99%
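The bias/variance statement in the preceding quotation is the standard decomposition of expected squared prediction error; writing it out makes the tradeoff explicit. This is textbook material, not something specific to the cited papers.

```latex
% Bias-variance decomposition for an estimator \hat{y}(x) of y = f(x) + \varepsilon,
% with \operatorname{Var}(\varepsilon) = \sigma^2:
\mathbb{E}\!\left[(y - \hat{y}(x))^2\right]
  = \underbrace{\left(\mathbb{E}[\hat{y}(x)] - f(x)\right)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}\!\left[\left(\hat{y}(x) - \mathbb{E}[\hat{y}(x)]\right)^2\right]}_{\text{variance}}
  + \sigma^2 .
```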
“…The minimization expression for the TR variant RR [24,42–44] […] The L-curve for selecting tuning parameters [3,20,21,24–27,29] can be formed by plotting mean RMSEC or RMSECV against a model variance or complexity measure. Models in the corner region of the L-curve represent acceptable compromises for the bias/variance tradeoff,…”
Section: RR (mentioning)
confidence: 99%
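The minimization expression lost from the quotation above is presumably the standard ridge/Tikhonov objective, min_b (||Xb − y||_2^2 + η^2 ||b||_2^2). A rough sketch of the L-curve construction it describes, with invented data, a hypothetical tuning grid, and RMSECV from 5-fold cross-validation, might look like this:

```python
# Rough L-curve sketch for ridge regression: mean RMSECV plotted against the 2-norm
# of the regression vector as the variance/complexity measure.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 30))
b_true = rng.standard_normal(30)
y = X @ b_true + 0.2 * rng.standard_normal(50)

curve = []
for lam in np.logspace(-3, 3, 25):
    model = Ridge(alpha=lam, fit_intercept=False)
    y_cv = cross_val_predict(model, X, y, cv=5)        # cross-validated predictions
    rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
    b_norm = np.linalg.norm(model.fit(X, y).coef_)     # ||b||_2 as complexity measure
    curve.append((b_norm, rmsecv))

# Plotting rmsecv (y-axis) against b_norm (x-axis) traces out the L-curve; models in
# the corner region balance low error against a small regression vector norm.
for b_norm, rmsecv in curve:
    print(f"||b|| = {b_norm:6.3f}   RMSECV = {rmsecv:.4f}")
```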
“…Sensitivity to instrumental noise was taken into account as suggested elsewhere [20–22] by calculating the 2-norm of the regression vector (||b_av||), which is defined as ||b_av|| = (Σ_k (b_k^av)²)^(1/2) (Eq. 12), where b_k^av denotes the regression coefficient associated with variable x_k in the ensemble model. In fact, by following the demonstration provided by Pinto et al. [20], it can be shown that s_ŷ,av = s_noise ||b_av||, where s_noise is the standard deviation of the instrumental noise (assumed to be homoscedastic and uncorrelated across the model variables) and s_ŷ,av is the standard deviation of the error in the ensemble model predictions resulting from the propagation of the instrumental noise.…”
Section: Evaluation of the MLR-SPA-subagging models (mentioning)
confidence: 99%
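The propagation relation quoted above, s_ŷ,av = s_noise ||b_av||, follows from the linearity of the prediction in the noise. A small numerical check, using a purely hypothetical ensemble regression vector, is sketched below:

```python
# Numerical check of s_yhat = s_noise * ||b_av|| for homoscedastic, uncorrelated
# instrumental noise propagated through a linear prediction x @ b_av.
import numpy as np

rng = np.random.default_rng(2)
b_av = 0.05 * rng.standard_normal(100)        # hypothetical ensemble regression vector
s_noise = 0.01                                # instrumental noise standard deviation

# propagate many noise realizations through the linear model
noise = s_noise * rng.standard_normal((20000, b_av.size))
pred_errors = noise @ b_av

print("empirical  s_yhat:", pred_errors.std())
print("predicted  s_yhat:", s_noise * np.linalg.norm(b_av))
```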