2020
DOI: 10.1137/20m1322029

Compressed Principal Component Analysis of Non-Gaussian Vectors

Abstract: A novel approximate representation of non-Gaussian random vectors is introduced and validated, which can be viewed as a Compressed Principal Component Analysis (CPCA). This representation relies on the eigenvectors of the covariance matrix obtained as in a Principal Component Analysis (PCA) but expresses the random vector as a linear combination of a random sample of N of these eigenvectors. In this model, the indices of these eigenvectors are independent discrete random variables with probabilities proportion…
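The abstract is truncated, but its outline suggests a two-stage construction: a standard PCA eigendecomposition of the covariance matrix, followed by a random selection of N eigenvector indices. The sketch below is a hypothetical, illustrative reading only; in particular, the index probabilities being proportional to the eigenvalues is an assumption (the abstract is cut off at exactly that point), not the paper's confirmed method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic non-Gaussian data (exponentiated correlated Gaussians), n samples in d dims.
n, d, N = 5000, 20, 5
X = np.exp(0.1 * rng.standard_normal((n, d)) @ rng.standard_normal((d, d)))

# Stage 1: ordinary PCA of the covariance matrix.
mean = X.mean(axis=0)
C = np.cov(X - mean, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)                 # ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # reorder: largest first

# Stage 2: draw N eigenvector indices independently, with probabilities
# ASSUMED proportional to the eigenvalues (hypothetical reading of the
# truncated abstract). Duplicates are collapsed before projecting.
p = np.clip(eigvals, 0.0, None)
p /= p.sum()
idx = np.unique(rng.choice(d, size=N, p=p))

# Approximate one sample as a linear combination of the sampled eigenvectors.
x = X[0] - mean
coeffs = eigvecs[:, idx].T @ x
x_approx = mean + eigvecs[:, idx] @ coeffs
```

Because the largest eigenvalues carry most of the selection probability, the random subset tends to include the dominant directions, which is what makes the representation "compressed" relative to a full PCA expansion.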

Cited by 3 publications (2 citation statements)
References 52 publications
“…It is not possible here to propose a review of all these methods, knowing that the use of each requires a precise analysis of its domain of validity, which depends on the hypotheses with which it was constructed. Nevertheless, we refer the reader to [74] for methods based on probability theory and mathematical statistics, including the polynomial chaos expansion methodology, to [75,76,77,78,79,80] for surrogate-based modeling, to [81,82,83,27,84,85,86,87,88] for projection-based model reduction, and to [89,90,91,92,93] for optimization of expensive functions.…”
Section: Probabilistic Learning On Manifolds (Plom) Used As a Machine...mentioning
confidence: 99%
“…PCE with random coefficients was explored by [8,1,9], while Tipireddy presented a basis adaptation in homogeneous chaos spaces [10]. A compressed principal component analysis of non-Gaussian vectors using symmetric polynomial chaos was proposed by Mignolet [11]. Significant work has also been devoted to accelerating the stochastic convergence of PCE [12,13,14,10,15].…”
Section: Introductionmentioning
confidence: 99%