2019
DOI: 10.48550/arxiv.1911.03813
Preprint
Estimating Higher-Order Moments Using Symmetric Tensor Decomposition

Samantha Sherman,
Tamara G. Kolda

Abstract: We consider the problem of decomposing higher-order moment tensors, i.e., the sum of symmetric outer products of data vectors. Such a decomposition can be used to estimate the means in a Gaussian mixture model and for other applications in machine learning. The dth-order empirical moment tensor of a set of p observations of n variables is a symmetric d-way tensor. Our goal is to find a low-rank tensor approximation comprising r ≪ p symmetric outer products. The challenge is that forming the empirical moment te…
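To make the objects in the abstract concrete, here is a minimal NumPy sketch that explicitly forms the third-order (d = 3) empirical moment tensor of p observations of n variables and evaluates a rank-r symmetric approximation. This is purely illustrative: the paper's point is precisely to *avoid* forming the moment tensor explicitly (its storage is O(n^d)), and the factor values below are arbitrary placeholders, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, r = 4, 50, 2

# p observations of n variables, one per row
X = rng.standard_normal((p, n))

# Third-order empirical moment tensor:
#   M = (1/p) * sum_i x_i (outer) x_i (outer) x_i,  an n x n x n symmetric tensor
M = np.einsum('pi,pj,pk->ijk', X, X, X) / p

# A rank-r symmetric model:  Mhat = sum_j lam[j] * a_j (outer) a_j (outer) a_j
# (A and lam here are arbitrary illustrative values, not fitted factors)
A = rng.standard_normal((n, r))
lam = np.ones(r)
Mhat = np.einsum('j,ij,kj,lj->ikl', lam, A, A, A)

# M is invariant under any permutation of its three indices
assert np.allclose(M, M.transpose(1, 0, 2))
assert np.allclose(M, M.transpose(2, 1, 0))

# Fit residual of this (unfitted) sketch model in the Frobenius norm
err = np.linalg.norm(M - Mhat)
```

Note that even this small example stores n^3 entries for M; for large n and d, an implicit method works with X directly and never materializes M, which is the regime the paper targets.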

Cited by 2 publications (2 citation statements) · References 27 publications (41 reference statements)
“…Another aspect we would like to study is the possible extension of SPM to implicit settings, as in [62]. Here a low-rank symmetric tensor arises as the population moment of a latent variable model (for example, a mixture of Gaussians).…”
Section: Discussion
confidence: 99%
“…Another common use of GMMs in the literature, similar to our application here, is as test cases for a learning algorithm intended to solve a more general problem. Examples include clustering (see, e.g., Jiang et al., 2019; Panahi et al., 2017) and tensor factorization (see, e.g., Sherman and Kolda, 2019).…”
Section: Introduction
confidence: 99%