2019
DOI: 10.1016/j.jcp.2019.03.003
Anomaly detection in scientific data using joint statistical moments

Cited by 25 publications (21 citation statements)
References 24 publications
“…and is of interest in different contexts. [16–20] Expanding the numerator for n = m = 2 yields…”
Section: Pairwise Correlation
confidence: 99%
“…The cokurtosis [14,15] of two variables is defined as
$$\mathrm{Cok}(x_1, x_2) = \frac{E\!\left[(x_1 - \langle x_1\rangle)^n (x_2 - \langle x_2\rangle)^m\right]}{\sigma_{x_1}^{n}\,\sigma_{x_2}^{m}}$$
and is of interest in different contexts. [16–20] Expanding the numerator for $n = m = 2$ yields
$$\begin{aligned}
\langle x_1^2 x_2^2\rangle &- 2\,\mathrm{Cov}(x_1^2, x_2)\langle x_2\rangle - 2\,\mathrm{Cov}(x_1, x_2^2)\langle x_1\rangle + 4\,\mathrm{Cov}(x_1, x_2)\langle x_1\rangle\langle x_2\rangle \\
&- \langle x_1^2\rangle\langle x_2\rangle^2 - \langle x_1\rangle^2\langle x_2^2\rangle + \langle x_1\rangle^2\langle x_2\rangle^2
\end{aligned}$$
For symmetrical distributions, the first term is the only one which survives Cok(x1, x2…”
Section: Introduction
confidence: 99%
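The quoted expansion is straightforward to sanity-check numerically. Below is a minimal NumPy sketch (function names are mine, not from the cited papers) that computes the standardized cokurtosis directly from centered samples and verifies the n = m = 2 expansion of the numerator against the direct central-moment computation.

```python
import numpy as np

def cokurtosis(x1, x2, n=2, m=2):
    """Standardized joint moment E[(x1-<x1>)^n (x2-<x2>)^m] / (sigma1^n sigma2^m)."""
    c1 = x1 - x1.mean()
    c2 = x2 - x2.mean()
    return np.mean(c1**n * c2**m) / (x1.std()**n * x2.std()**m)

def numerator_expanded(x1, x2):
    """n = m = 2 numerator written in raw moments and covariances, mirroring the
    quoted identity (population statistics throughout)."""
    m1, m2 = x1.mean(), x2.mean()
    cov = lambda a, b: np.mean(a * b) - a.mean() * b.mean()
    return (np.mean(x1**2 * x2**2)
            - 2 * cov(x1**2, x2) * m2
            - 2 * cov(x1, x2**2) * m1
            + 4 * cov(x1, x2) * m1 * m2
            - np.mean(x1**2) * m2**2
            - m1**2 * np.mean(x2**2)
            + m1**2 * m2**2)

rng = np.random.default_rng(0)
x1 = rng.normal(loc=1.0, size=100_000)
x2 = 0.5 * x1 + rng.normal(loc=-2.0, size=100_000)
direct = np.mean((x1 - x1.mean())**2 * (x2 - x2.mean())**2)
assert np.isclose(direct, numerator_expanded(x1, x2))
print(cokurtosis(x1, x2))
```

The nonzero means in the example are deliberate: with zero-mean inputs most terms of the expansion vanish and the check would be trivial.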
“…Blockwise distribution based visualization and analysis has been applied in multiple contexts, including feature tracking [17], anomaly data [1], and in situ summarization in several different domains [16,52]. The blockwise distribution driven analysis package computes distribution functions for a given set of variables, over each subblock of the domain decomposition of a distributed Cartesian volume.…”
Section: Blockwise Distribution Driven Analysis
confidence: 99%
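As a rough illustration of the blockwise idea described above (a generic sketch, not the cited analysis package's API), the following computes a histogram over each sub-block of the domain decomposition of a Cartesian volume; the block shape is assumed to tile the volume.

```python
import numpy as np

def blockwise_histograms(volume, block_shape, bins=32, value_range=None):
    """Histogram of the field over each sub-block of a 3D Cartesian volume.
    Returns {block origin -> bin counts} plus the shared bin edges."""
    if value_range is None:
        value_range = (float(volume.min()), float(volume.max()))
    nz, ny, nx = volume.shape
    bz, by, bx = block_shape
    hists = {}
    edges = None
    for k in range(0, nz, bz):
        for j in range(0, ny, by):
            for i in range(0, nx, bx):
                block = volume[k:k + bz, j:j + by, i:i + bx]
                counts, edges = np.histogram(block, bins=bins, range=value_range)
                hists[(k, j, i)] = counts
    return hists, edges

# Example: summarize a 64^3 field with 16^3 sub-blocks -> 4^3 = 64 block histograms.
field = np.random.default_rng(1).normal(size=(64, 64, 64))
hists, edges = blockwise_histograms(field, (16, 16, 16))
print(len(hists), hists[(0, 0, 0)].sum())   # 64 blocks, 16^3 samples per block
```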
“…Tensor decomposition of moment and cumulant tensors are used in a variety of statistical and data science applications, including independent component analysis and blind source separation [10,13,15], clustering [34,12], learning Gaussian mixture models [23,4,20,18,33], latent variable models [2,3], outlier detection [16,1], feature extraction in hyperspectral imagery [19], and multireference alignment [32]. In these cases, it is assumed that the empirical higher-order moment is already computed.…”
confidence: 99%
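For context on what "the empirical higher-order moment is already computed" entails, here is a small sketch (illustrative only; the function name is mine) that forms the order-d empirical moment tensor of n-dimensional samples as the sample mean of the d-fold outer product. Its n^d entries are what drive the storage figures quoted below.

```python
import numpy as np

def empirical_moment_tensor(samples, d=3):
    """Order-d empirical moment tensor: the mean over samples of the d-fold
    outer product x ⊗ x ⊗ ... ⊗ x.  For n-dimensional samples this has n**d
    entries.  Sketch only; d is capped at 8 to keep the einsum subscripts simple."""
    assert 1 <= d <= 8
    num_samples = samples.shape[0]
    letters = [chr(ord("a") + k) for k in range(d)]
    subscripts = ",".join("i" + c for c in letters) + "->" + "".join(letters)
    return np.einsum(subscripts, *([samples] * d)) / num_samples

rng = np.random.default_rng(2)
samples = rng.normal(size=(10_000, 5))      # 10,000 samples of dimension n = 5
M3 = empirical_moment_tensor(samples, d=3)  # shape (5, 5, 5)
print(M3.shape)
```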
“…For d = 3 and n = 1000, X requires 8 GB of storage; for d = 4 and n = 200, X requires 12 GB of storage. The per-iteration storage and floating-point operation costs to compute a rank r = 10 approximation are shown in Figure 1.2 for different values of d and n.”
confidence: 99%
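The storage figures are consistent with treating X as a dense order-d moment tensor over an n-dimensional variable, stored as n^d double-precision (8-byte) entries; that reading is my assumption, not stated in the excerpt. A quick check:

```python
# Back-of-the-envelope check, assuming X holds n**d double-precision (8-byte) entries.
for d, n in [(3, 1000), (4, 200)]:
    nbytes = 8 * n**d
    print(f"d={d}, n={n}: {nbytes / 1e9:.1f} GB ({nbytes / 2**30:.1f} GiB)")
# d=3, n=1000: 8.0 GB (7.5 GiB)
# d=4, n=200: 12.8 GB (11.9 GiB)
```

Either rounding of the d = 4 case lands close to the quoted 12 GB.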