2016
DOI: 10.1109/lsp.2016.2577383
Partitioned Alternating Least Squares Technique for Canonical Polyadic Tensor Decomposition

Cited by 16 publications (14 citation statements) · References 12 publications
“…Using the optimization problem of the MKLMF algorithm (15) in this paper, we study the linear combination of 3 base kernels. At the same time, we use 3 base kernels to test the KMF algorithm proposed in this paper.…”
Section: Results Analysis
confidence: 99%
“…where ‖·‖_F denotes the Frobenius norm and the second term is a regularization term that avoids overfitting. These problems are non-convex and can be solved by gradient descent or by Alternating Least Squares (ALS) [15]: assuming that U is fixed, V is obtained by solving equation (2). At this point, the problem decomposes into N separate ridge regression problems [16]. For the j-th column of V, the ridge regression problem can be expressed as follows:…”
confidence: 99%
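The ridge regression step quoted above (with U fixed, each column of V solves an independent regularized least-squares problem) can be sketched as follows. This is a minimal NumPy illustration, not the cited paper's implementation; `als_ridge_update` and the variable names are hypothetical, and all N column subproblems share one Gram matrix, so they can be solved in a single call.

```python
import numpy as np

def als_ridge_update(Y, U, lam):
    """With U fixed, solve for V column by column: each column v_j of V
    minimizes ||y_j - U v_j||^2 + lam * ||v_j||^2, a ridge regression.
    All columns share the Gram matrix U^T U + lam*I, so they are solved jointly."""
    r = U.shape[1]
    G = U.T @ U + lam * np.eye(r)      # shared (r x r) ridge Gram matrix
    V = np.linalg.solve(G, U.T @ Y)    # solves all N ridge subproblems at once
    return V

# Tiny sanity check on an exactly factorizable matrix
rng = np.random.default_rng(0)
U_true = rng.standard_normal((20, 3))
V_true = rng.standard_normal((3, 15))
Y = U_true @ V_true
V_hat = als_ridge_update(Y, U_true, 1e-8)
```

With a near-zero regularizer on exact data, `U_true @ V_hat` reconstructs `Y` to numerical precision; in a full ALS loop one would alternate this update between U and V.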
“…Felten et al. [35]: the goal of the Alternating Least Squares (ALS) algorithm is to update each matrix factor alternately in each iteration by solving quadratic subproblems. Tichavský et al. [36]: ALS operates on the factor matrices A, B, and C, solving a quadratic problem for one factor while the other two are held fixed; thus when solving for A, the matrices B and C are fixed. Similarly, when solving the quadratic subproblem for B, then C and A are fixed, and when solving for C, then A and B are fixed.…”
Section: Where Represent
confidence: 99%
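The alternating A/B/C scheme described above can be sketched for a third-order tensor. This is a generic CP-ALS sketch under my own conventions (row ordering of the Khatri-Rao product matches the mode unfoldings used here), not the partitioned ALS variant of the cited paper; `cp_als` and `khatri_rao` are hypothetical helper names.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Kronecker product: row (i*B.shape[0] + j) holds A[i,:] * B[j,:]."""
    r = A.shape[1]
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, r)

def cp_als(Y, rank, n_iter=100, seed=0):
    """Minimal CP decomposition via ALS: update A with B, C fixed, then B, then C."""
    I, J, K = Y.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    Y1 = Y.reshape(I, -1)                     # mode-1 unfolding, columns indexed j*K + k
    Y2 = np.moveaxis(Y, 1, 0).reshape(J, -1)  # mode-2 unfolding, columns indexed i*K + k
    Y3 = np.moveaxis(Y, 2, 0).reshape(K, -1)  # mode-3 unfolding, columns indexed i*J + j
    for _ in range(n_iter):
        # Each update is a linear least-squares problem with the other two factors fixed;
        # '*' is the elementwise (Hadamard) product of the small r x r Gram matrices.
        A = Y1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = Y2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = Y3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

On a tensor that exactly admits a low-rank CP structure, a few hundred of these sweeps typically recover the factors up to the usual scaling and permutation ambiguities.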
“…Tensor data conventionally still requires estimating a filter for the data mismatch and estimation error. In this study, these are implemented together with other algorithms [36]. We therefore added Alternating Least Squares (ALS) as an embedded algorithm capable of computing the customer and menu matrices, which are not sequential and contain many negative values that must be converted to quadratic form without eliminating the back-and-forth mode. Previous studies [40] [36] [35] reported that ALS is useful for overcoming convexity, i.e., coordinates bulging toward the middle or toward the edge of void values.…”
Section: C5 Calculations By Tensor Alternating Least Squares (ALS)
confidence: 99%
“…To obtain the CPD, we can minimize the Frobenius norm of the error between the tensor Y and its estimate, e.g., using the Alternating Least Squares (ALS) algorithm to update the factor matrices sequentially [7][8][9][10], or the non-linear conjugate gradient method [11,12], the Levenberg-Marquardt (LM) algorithm [13,14], the Krylov LM algorithm [15,16], or the nonlinear least squares (NLS) algorithm [17] to update all the parameters at once. In practice, real-world data does not exactly admit a low-rank CPD.…”
Section: Introduction
confidence: 99%
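The sequential ALS factor update mentioned above has a standard closed form. As a sketch for a third-order tensor in the common Kolda-Bader notation (Y_(1) the mode-1 unfolding, ⊙ the Khatri-Rao product, ∗ the Hadamard product, † the pseudoinverse; the exact Khatri-Rao ordering depends on the unfolding convention):

```latex
A \leftarrow Y_{(1)}\,(C \odot B)\,\bigl(C^{\top}C \ast B^{\top}B\bigr)^{\dagger}
```

with analogous updates for B and C; each sweep solves three linear least-squares problems, which is why ALS iterations are cheap but convergence is only guaranteed to a stationary point of the non-convex objective.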