1994
DOI: 10.1002/cem.1180080206

Generalized rank annihilation method. I: Derivation of eigenvalue problems

Abstract: SUMMARY Rank annihilation factor analysis (RAFA) is a method for multicomponent calibration using two data matrices simultaneously, one for the unknown and one for the calibration sample. In its most general form, the generalized rank annihilation method (GRAM), an eigenvalue problem has to be solved. In this first paper different formulations of GRAM are compared and a slightly different eigenvalue problem will be derived. The eigenvectors of this specific eigenvalue problem constitute the transformation matri…
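As a minimal numerical sketch of the eigenvalue formulation the abstract refers to, assuming the common bilinear model M = X C_M Y^T (unknown sample) and N = X C_N Y^T (calibration sample) with diagonal concentration matrices C_M and C_N; the function name gram and the joint-SVD route below are illustrative, not necessarily the exact formulation derived in this paper:

import numpy as np
from scipy.linalg import eig

def gram(M, N, n_components):
    # Truncated SVD of the summed matrices spans the common factor space,
    # since M + N = X (C_M + C_N) Y^T under the assumed bilinear model.
    U, s, Vt = np.linalg.svd(M + N, full_matrices=False)
    U = U[:, :n_components]
    V = Vt[:n_components, :].T
    # Project both matrices into that subspace.
    A = U.T @ M @ V
    B = U.T @ N @ V
    # Generalized eigenproblem A t = lambda B t; each eigenvalue is the
    # concentration ratio C_M[k, k] / C_N[k, k] of one component.
    eigvals, T = eig(A, B)
    return np.real(eigvals), np.real(T)

# Quick check on simulated data (hypothetical profiles and concentrations)
rng = np.random.default_rng(0)
X = rng.random((30, 2))
Y = rng.random((25, 2))
M = X @ np.diag([1.0, 3.0]) @ Y.T   # unknown sample
N = X @ np.diag([0.5, 1.0]) @ Y.T   # calibration sample
print(np.sort(gram(M, N, 2)[0]))    # approx. [2.0, 3.0]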

Cited by 59 publications (38 citation statements); references 18 publications.
“…Equations (6) and (7) are consistent with the parametrization favoured by Faber et al [11]. This parametrization can be made unique by imposing constraints on X and Y.…”
Section: Generalized Rank Annihilation Methods (supporting)
confidence: 66%
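The snippet does not reproduce its equations (6) and (7). As a hedged reading, the bilinear parametrization usually associated with GRAM is

\[ M = X C_M Y^{\mathrm{T}}, \qquad N = X C_N Y^{\mathrm{T}}, \]

where C_M and C_N are diagonal concentration matrices. Each column of X and Y can be rescaled without changing the products, so the factorization is unique only after a constraint such as normalizing those columns to unit length; this is presumably the kind of constraint the citing authors mean.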
“…Lorber's definition of multivariate sensitivity has been generalized by Faber et al [10] to multilinear data. The last step in (11) has been proved by Bauer et al [13].…”
Section: Generalized Rank Annihilation Methods (mentioning)
confidence: 82%
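For orientation (not quoted from either paper): Lorber's first-order sensitivity for analyte k is the inverse norm of its regression vector, \( \mathrm{SEN}_k = 1/\lVert b_k \rVert \). The bilinear second-order generalization attributed to Faber et al. is often written, for column-normalized profile matrices X and Y, as

\[ \mathrm{SEN}_k = \Big( \big[(X^{\mathrm{T}}X)^{-1}\big]_{kk}\,\big[(Y^{\mathrm{T}}Y)^{-1}\big]_{kk} \Big)^{-1/2}; \]

treat the exact normalization convention here as an assumption.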
“…14,16 The only difference lies in the way the left and right singular vectors are estimated. In the current method this is done by decomposing the column and row augmented matrices which can be recast into equation (12) using the reconstruction equations for the pure profiles.…”
Section: Generalization of Wilson et al. (mentioning)
confidence: 99%
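A minimal sketch of that estimation step, assuming "column augmented" means placing the two matrices side by side and "row augmented" means stacking them vertically (the function name is illustrative):

import numpy as np

def augmented_subspaces(M, N, n_components):
    # Columns of [M | N] lie in the span of the common column profiles X,
    # so its leading left singular vectors estimate that subspace.
    U, _, _ = np.linalg.svd(np.hstack([M, N]), full_matrices=False)
    # Rows of [M; N] lie in the span of the common row profiles Y,
    # so its leading right singular vectors estimate that subspace.
    _, _, Vt = np.linalg.svd(np.vstack([M, N]), full_matrices=False)
    return U[:, :n_components], Vt[:n_components, :].T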
“…Recently a reformulation of the eigenvalue problem has led to a simplification of the error propagation. 16 Irrespective of the differences in approach taken in these theoretical contributions, they all have in common that the measurement error is assumed to be uncorrelated and homoscedastic. The advantage of this assumption is that it leads to a relatively simple expression for the standard error in the estimated eigenvalue.…”
Section: Introduction (mentioning)
confidence: 99%
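The snippet does not give the simplified expression itself. The generic first-order result that such error propagation builds on is standard: for the generalized eigenproblem \( A t_k = \lambda_k B t_k \) with left eigenvector \( u_k \),

\[ \delta\lambda_k \approx \frac{u_k^{\mathrm{T}}\,(\delta A - \lambda_k\,\delta B)\,t_k}{u_k^{\mathrm{T}} B\, t_k}, \]

so that under uncorrelated, homoscedastic measurement error of variance \( \sigma^2 \), the standard error of the estimated eigenvalue follows by propagating the perturbations \( \delta A \) and \( \delta B \) through this linear form.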
“…Faber, Bro, and Hopke [15] compared ALS with a number of competing algorithms: direct trilinear decomposition (DTLD) [16,17,14,18,29,31,42], alternating trilinear decomposition (ATLD) [50], self-weighted alternating trilinear decomposition (SWATLD) [10,11], pseudo alternating least squares (PALS) [9], alternating coupled vectors resolution (ACOVER) [23], alternating slice-wise diagonalization (ASD) [22], and alternating coupled matrices resolution (ACOMAR) [30]. It is shown that while none of the algorithms is better than ALS in terms of the quality of solution, ASD may be an alternative to ALS when the computation time is a priority.…”
Section: Introduction (mentioning)
confidence: 99%
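For orientation, a compact alternating least squares (ALS) sketch for the trilinear (PARAFAC) model \( x_{ijk} = \sum_r a_{ir} b_{jr} c_{kr} \), assuming a dense numpy array; this is textbook ALS, not the code used in the cited comparison:

import numpy as np

def khatri_rao(A, B):
    # Column-wise Kronecker product, shape (I*J, R)
    return (A[:, None, :] * B[None, :, :]).reshape(-1, A.shape[1])

def parafac_als(X, rank, n_iter=500, seed=0):
    # Fit X[i, j, k] ~ sum_r A[i, r] B[j, r] C[k, r] by alternating
    # least squares over the three mode unfoldings.
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    X0 = X.reshape(I, J * K)                      # mode-1 unfolding
    X1 = np.moveaxis(X, 1, 0).reshape(J, I * K)   # mode-2 unfolding
    X2 = np.moveaxis(X, 2, 0).reshape(K, I * J)   # mode-3 unfolding
    for _ in range(n_iter):
        A = X0 @ np.linalg.pinv(khatri_rao(B, C).T)
        B = X1 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = X2 @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C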