2007
DOI: 10.1007/s00180-007-0051-2

Computational considerations in functional principal component analysis

Keywords: Functional data analysis, Hilbert spaces, Principal components, Covariance estimation, Orthogonal projection

Cited by 34 publications (18 citation statements); references 11 publications. Citing publications span 2008–2024.
“…Theoretical and asymptotic properties of FPCA for Hilbert-valued random functions were studied in [50–54]. In the case of a basis expansion for each functional variable (see Equation (1)), each functional PCA is equivalent to the multivariate PCA of the matrix AΨ^(1/2), with A = (a_ij) the n × p matrix of basis coefficients and Ψ = (ψ_ij) the p × p matrix of inner products between basis functions, ψ_ij = ⟨φ_i, φ_j⟩, i, j = 1, …, p. The vector of basis coefficients of the l-th PC weight function f_l(t) is given by b_l = Ψ^(-1/2) v_l, where v_l is the l-th eigenvector of the sample covariance matrix of AΨ^(1/2) (see [55] for a detailed study).…”
Section: Functional Principal Component Regression
confidence: 99%
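The basis-expansion reduction quoted above translates directly into a few lines of linear algebra: form the symmetric square root of the Gram matrix Ψ, run ordinary multivariate PCA on AΨ^(1/2), and map the eigenvectors back with Ψ^(-1/2). The following is a minimal sketch in Python/NumPy; the function name fpca_basis and its signature are our own illustration, not code from the cited papers, and it assumes Ψ is positive definite (i.e., the basis functions are linearly independent).

```python
# Minimal sketch of basis-expansion FPCA (illustrative names, not the
# cited papers' code). Assumes the Gram matrix Psi is positive definite.
import numpy as np

def fpca_basis(A, Psi, n_components=2):
    """FPCA from basis coefficients.

    A   : (n, p) matrix of basis coefficients of the sample curves.
    Psi : (p, p) Gram matrix of inner products <phi_i, phi_j>.

    Returns (scores, B), where column l of B holds the basis coefficients
    b_l = Psi^(-1/2) v_l of the l-th PC weight function f_l(t).
    """
    # Symmetric square root and inverse square root of the Gram matrix.
    w, U = np.linalg.eigh(Psi)
    Psi_half = U @ np.diag(np.sqrt(w)) @ U.T
    Psi_inv_half = U @ np.diag(1.0 / np.sqrt(w)) @ U.T

    # Multivariate PCA of A Psi^(1/2): eigendecompose its sample covariance.
    X = (A - A.mean(axis=0)) @ Psi_half
    S = X.T @ X / (X.shape[0] - 1)            # sample covariance matrix
    evals, V = np.linalg.eigh(S)
    order = np.argsort(evals)[::-1][:n_components]
    V = V[:, order]                           # leading eigenvectors v_l

    B = Psi_inv_half @ V                      # PC weight-function coefficients
    scores = X @ V                            # principal component scores
    return scores, B
```

For instance, with a B-spline basis, A would hold each curve's fitted spline coefficients and Ψ the pairwise integrals of the basis functions; for an orthonormal basis Ψ is the identity and the procedure collapses to ordinary PCA of A.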
“…For computational aspects of FPCA, see Yao and Lee, who suggested an iterative procedure for estimating functional PCs that uses penalized spline regression to update the mean function within the iteration process, and Ocaña et al., who focused on an algorithm for computing FPCA estimates based on classical multivariate PCA. Contributions to various aspects of FPCA include those of Boente and Fraiman, who discussed kernel-based smoothed functional PCs, Cardot, who extended FPCA to nonparametric functional mixed-effect models, called conditional FPCA, and Mas, who considered local FPCA, in which local covariance operators are used.…”
Section: Functional Principal Component Analysis
confidence: 99%
“…Functional PCA and functional PLS regression were introduced as natural extensions of their multivariate counterparts to solve the problems of high dimension and multicollinearity associated with the scalar-on-function linear model (Deville, 1974; Dauxois et al., 1982; Ocaña et al., 1999, 2007; Preda and Saporta, 2005; Aguilera et al., 2016). The two methodologies were compared on different simulated datasets, with the conclusion that they have similar forecasting performance, but the estimated parameter function provided by functional PLS regression is more accurate and needs fewer components (Reiss and Ogden, 2007; Aguilera et al., 2010; Delaigle and Hall, 2012b,c).…”
Section: Introduction
confidence: 99%