2001
DOI: 10.1016/s0169-7439(01)00119-8
The successive projections algorithm for variable selection in spectroscopic multicomponent analysis

Cited by 1,093 publications
(595 citation statements)
References 23 publications
“…The multi-spectral device can only contain a few spectral bands, so feature selection has been used, which means locating the best minimal subset of the original features. This was done using the Successive Projections Algorithm (SPA) (Araujo et al, 2001). First, the instrumental response data are used to create chains of variables through a sequence of vector projection operations designed to minimize multi-collinearity among the variables of the chain. Second, the algorithm builds a model for each candidate subset of variables extracted from the chains generated.…”
Section: Spectral Data Analysis For Disease Detection
confidence: 99%
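The chain-building phase quoted above (successive orthogonal projections that minimize collinearity among chained variables) can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name `spa_chain` and the matrix layout (samples in rows, spectral variables in columns) are assumptions for the example.

```python
import numpy as np

def spa_chain(X, start, n_select):
    """Build one SPA variable chain starting from column `start`.

    At each step, every remaining column is projected onto the subspace
    orthogonal to the most recently selected column, and the column with
    the largest projected norm (i.e. the least collinear with the chain)
    is appended.
    """
    Xp = X.astype(float).copy()
    chain = [start]
    for _ in range(n_select - 1):
        xk = Xp[:, chain[-1]]
        # Project each column onto the orthogonal complement of xk
        proj = Xp - np.outer(xk, xk @ Xp) / (xk @ xk)
        proj[:, chain] = 0.0  # exclude already-selected columns
        chain.append(int(np.argmax(np.linalg.norm(proj, axis=0))))
        Xp = proj
    return chain
```

Running one chain per possible starting variable, then fitting a model for each candidate subset drawn from the chains, reproduces the two-phase structure described in the excerpt.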
“…Each chain starts with one of the variables under consideration and is successively augmented with additional variables chosen in order to display the least collinearity with the previous ones, as described in earlier papers. 14,17 The notation {SEL(1, k), SEL(2, k), …, SEL(M, k)} is used to denote the index set of variables belonging to the chain initialized with x k (that is, SEL(1, k) = k).…”
Section: Robust Variable Selection By SPA
confidence: 99%
“…13 This algorithm was originally proposed for the minimization of collinearity problems in Multiple Linear Regression (MLR). 14 Honorato et al 13 adapted SPA to select variables that convey information concerning the property of interest and are robust with respect to the differences between two instruments. The proposed strategy was applied to two calibration transfer problems involving the determination of T90% in gasoline by FT-IR spectrometry and moisture in corn by Near Infrared (NIR) spectrometry.…”
Section: Introduction
confidence: 99%
“…More details concerning the operations involved in SPA can be found elsewhere. 1,9 In all previous applications of SPA, the performance metric employed in Phases 2 and 3 was the RMSEV value obtained in an independent validation set of N_val samples, defined as

RMSEV = sqrt[ (1/N_val) Σ_{n=1}^{N_val} (y_val,n − ŷ_val,n)² ]   (1)

where y_val,n and ŷ_val,n are the reference and predicted values of the parameter under consideration for the n-th validation sample. In the present work, an extension of this criterion to the cross-validation case is adopted by considering the root mean square error of cross-validation (RMSECV), defined as

RMSECV = sqrt[ (1/N_cal) Σ_{n=1}^{N_cal} (y_cal,n − ŷ_cal,n)² ]   (2)

where y_cal,n is the reference value of the parameter under consideration for the n-th sample of the calibration set itself, which contains N_cal samples.…”
Section: Background and Theory
confidence: 99%
“…[1][2][3][4][5][6] To address this issue, the present paper presents a comparative study between the use of a separate validation set and leave-one-out cross-validation for the selection of spectral variables by SPA. This investigation is of value to determine whether there are gains, in either parsimony or prediction performance, that may justify the use of cross-validation in SPA in view of the computational overhead.…”
Section: Introduction
confidence: 99%