2021
DOI: 10.1214/21-ejs1803
Principal component analysis for multivariate extremes

Abstract: In the probabilistic framework of multivariate regular variation, the first order behavior of heavy-tailed random vectors above large radial thresholds is ruled by a homogeneous limit measure. For a high dimensional vector, a reasonable assumption is that the support of this measure is concentrated on a lower dimensional subspace, meaning that certain linear combinations of the components are much likelier to be large than others. Identifying this subspace and thus reducing the dimension will facilitate a refi…
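To make the abstract's setting concrete, here is a minimal simulation sketch (not from the paper; the generator, dimensions, and threshold level are all illustrative assumptions): a heavy-tailed vector in $\mathbb{R}^5$ is built so that its extremes concentrate near a two-dimensional subspace, and the angles of the largest observations are checked against that subspace.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 100_000, 5, 2            # sample size, ambient dim, subspace dim

# Heavy-tailed factors (Pareto, tail index 2) pushed through a fixed
# d x k matrix A, plus light-tailed Gaussian noise: at large radii the
# noise is negligible, so extremes align with the column span of A.
A = rng.standard_normal((d, k))
Z = rng.pareto(2.0, size=(n, k)) + 1.0
X = Z @ A.T + rng.standard_normal((n, d))

# Angles W = X / |X| of the observations above a high radial threshold
norms = np.linalg.norm(X, axis=1)
high = norms > np.quantile(norms, 0.999)
W = X[high] / norms[high, None]

# Distance from each extremal angle to span(A) via the orthogonal projector
P = A @ np.linalg.pinv(A)          # P = A (A^T A)^{-1} A^T
print("median distance to span(A):", np.median(np.linalg.norm(W - W @ P, axis=1)))
```

The median distance printed at the end is small, reflecting that the angular measure of this toy model concentrates near the chosen subspace.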

Cited by 23 publications (19 citation statements)
References 31 publications
“…Furthermore, the angular measure is helpful for solving supervised and unsupervised learning tasks for sample points far away from the center of the distribution. In the spirit of principal component analysis, the eigendecomposition of the Gram matrix of the angular measure yields low-dimensional summaries of extremal dependence [11,19]. Anomalous data can be detected from unusual combinations of variables being large simultaneously [31] or from their lack of membership of minimum-volume sets of the unit sphere containing a large fraction of the total mass of the angular measure [47].…”
Section: Learning From Multivariate Extremes
confidence: 99%
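The eigendecomposition mentioned in this statement can be sketched in a few lines. The following is a hedged illustration, not the exact estimator of [11] or [19] (which differ in the choice of norm, centering, and threshold); the function name and the 0.99 quantile are assumptions:

```python
import numpy as np

def extremal_pca(X, q=0.99):
    """Sketch: eigendecomposition of the empirical Gram matrix of the
    angular measure. X is an (n, d) sample; q picks the radial
    threshold. Conventions (norm, centering) vary across references."""
    norms = np.linalg.norm(X, axis=1)
    keep = norms > np.quantile(norms, q)
    W = X[keep] / norms[keep, None]          # angles of the extremes
    G = W.T @ W / len(W)                     # empirical Gram matrix
    eigval, eigvec = np.linalg.eigh(G)       # ascending eigenvalues
    return eigval[::-1], eigvec[:, ::-1]     # largest first

# A sharp drop in the spectrum after k eigenvalues suggests the angular
# measure concentrates near a k-dimensional subspace spanned by the
# leading columns of eigvec.
```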
“…The upper confidence bound at level $1-\delta$ for the maximal deviation (1) stated in Theorem 3.1 below is derived from the decomposition (19) with framing sets $\Gamma_A^+(r^+, h)$ and $\Gamma_A^-(r^-, h)$ for specific choices of $r^+$, $r^-$ and $h$. To control the stochastic error, we need a handle on the complexity of the collection of framing sets. The Vapnik-Chervonenkis (VC) dimension of a collection $F$ of subsets of some set $X$ is the supremum (possibly infinite) of the set of positive integers $n$ with the property that there exists a subset $\{x_1, \ldots$…”
Section: Concentration Bounds For The Empirical Angular Measure
confidence: 99%
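The VC-dimension definition quoted above can be illustrated by brute force on a finite class. The following sketch is purely pedagogical and unrelated to the paper's framing sets; the helper names and the interval class are hypothetical examples:

```python
from itertools import combinations

def is_shattered(points, sets):
    """Check whether the family `sets` picks out every subset of
    `points`, i.e. shatters it."""
    traces = {frozenset(S & set(points)) for S in sets}
    return len(traces) == 2 ** len(points)

def vc_dimension_at_least(universe, sets, n):
    """Brute force: is some n-point subset of `universe` shattered?"""
    return any(is_shattered(c, sets) for c in combinations(universe, n))

# Intervals [a, b] over a grid shatter any 2 points, but never 3:
# the middle point cannot be excluded while keeping the outer two.
grid = range(6)
intervals = [set(range(a, b + 1)) for a in grid for b in grid if a <= b]
print(vc_dimension_at_least(grid, intervals, 2))  # True
print(vc_dimension_at_least(grid, intervals, 3))  # False -> VC dim is 2
```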
“…Hence, our generalized PCA can be interpreted as searching for a best "hyperplane" approximation in a kind of Fourier space. This contrasts in particular with the approaches proposed so far for extreme values (Jiang et al., 2020; Drees and Sabourin, 2021). The so-called spectral PCA (Thornhill et al., 2002) is somewhat close to our approach, but sticks to the idea of classic PCA: data of stationary processes are first Fourier transformed before they enter the classic PCA.…”
Section: Example: Max-stable Distributions
confidence: 73%
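As a rough illustration of the spectral-PCA recipe attributed to Thornhill et al. (2002) in this quote (Fourier transform first, classic PCA second), here is a hedged sketch; the use of magnitude spectra and the normalisation are assumptions, not details confirmed by that paper:

```python
import numpy as np

def spectral_pca(X, n_components=2):
    """Sketch of the spectral-PCA idea: Fourier-transform each row
    (one signal per row), then run classic PCA on the magnitude
    spectra. Windowing/normalisation choices are illustrative."""
    spectra = np.abs(np.fft.rfft(X, axis=1))   # per-signal spectra
    spectra -= spectra.mean(axis=0)            # center the features
    # Classic PCA via SVD of the centered spectra
    U, s, Vt = np.linalg.svd(spectra, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]
    return scores, Vt[:n_components]           # scores, loadings

# Usage note: rows of X are stationary time series; signals with
# similar spectra map to nearby scores, regardless of phase shifts.
```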
“…analyzing the extremes of a high dimensional random vector. Such studies can be divided into the following categories: clustering methods [6,7,39], support identification [32,33,8,9,53,44], Principal Component Analysis of the angular component of extremes [14,41,20], and graphical models for extremes [36,23,1]; see also [24] and the references therein. Our approach is remotely related to the last category: extremal graphical models.…”
Section: Dimensionality Reduction In EVT
confidence: 99%