2011
DOI: 10.1016/j.sigpro.2011.02.003
Iterative methods for the canonical decomposition of multi-way arrays: Application to blind underdetermined mixture identification

Abstract: Two main drawbacks can be identified in the alternating least squares (ALS) algorithm used to fit the canonical decomposition (CAND) of multi-way arrays. First, its slow convergence, caused by the presence of collinearity between factors in the multi-way array it decomposes. Second, its blindness to Hermitian symmetries of the considered arrays. The enhanced line search (ELS) scheme was found to be a good way to cope with the slow convergence of the ALS algorithm, together with a partial use o…
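For concreteness, here is a minimal NumPy sketch (all names illustrative) of an ALS sweep for a third-order CAND/CP model, followed by a crude grid search along the sweep direction in the spirit of ELS. The paper's actual ELS picks the optimal step by rooting a polynomial, and its symmetry-aware variants are not reproduced here; this is only a sketch of the two ingredients the abstract names.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a 3-way array (C ordering)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker (Khatri-Rao) product."""
    r = A.shape[1]
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, r)

def rebuild(factors):
    A, B, C = factors
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def als_els_cp(T, rank, n_sweeps=100, steps=(1.0, 1.5, 2.0, 3.0), seed=0):
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((d, rank)) for d in T.shape]
    for _ in range(n_sweeps):
        prev = [f.copy() for f in factors]
        # --- one ALS sweep: update each factor by linear least squares ---
        for n in range(3):
            others = [factors[m] for m in range(3) if m != n]
            kr = khatri_rao(others[0], others[1])  # matches unfold() ordering
            gram = (others[0].T @ others[0]) * (others[1].T @ others[1])
            factors[n] = unfold(T, n) @ kr @ np.linalg.pinv(gram)
        # --- crude ELS-like step: extrapolate along the sweep direction ---
        best, best_err = factors, np.linalg.norm(T - rebuild(factors))
        for s in steps[1:]:
            cand = [p + s * (f - p) for p, f in zip(prev, factors)]
            err = np.linalg.norm(T - rebuild(cand))
            if err < best_err:
                best, best_err = cand, err
        factors = best
    return factors
```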

Cited by 29 publications (18 citation statements) · References 47 publications
“…Second, it is strictly convex with respect to the linear combination of the independent sources. Then, SIMBEC solves the ICA problem by looking for the maxima of that cumulant index-based objective function [25]. The SOBI, SOBIrob, TFBSS, CoM2, JADE, FOBIUMJAD, ICAR3, FOOBI1 and 4-CANDHAPc [32] methods also perform ICA using cumulants of the data [18]. SOBI, SOBIrob and TFBSS use SO cumulants, CoM2 and JADE use both the SO and FO cumulants, and FOBIUMJAD, ICAR3, FOOBI1 and 4-CANDHAPc only use the FO cumulants of the data.…”
Section: Statistical Tools Characterizing Mutual Independence
confidence: 99%
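As background for the SO/FO terminology in this statement, here is a minimal sketch (not any of the cited algorithms; function names are illustrative) of the second-order and fourth-order cumulant estimators for zero-mean, real-valued data. Complex data, as in the Hermitian-symmetric setting of the paper itself, would require specific conjugation patterns not shown here.

```python
import numpy as np

def so_cumulant(X):
    """SO cumulant (covariance) of zero-mean real data X, shape (n_channels, n_samples)."""
    T = X.shape[1]
    return (X @ X.T) / T

def fo_cumulant(X):
    """FO cumulant tensor of zero-mean real data (naive O(N^4) estimator, for illustration)."""
    N, T = X.shape
    C2 = so_cumulant(X)
    M4 = np.einsum('it,jt,kt,lt->ijkl', X, X, X, X) / T  # fourth-order moment
    # cum(xi,xj,xk,xl) = E[xi xj xk xl] - Cij*Ckl - Cik*Cjl - Cil*Cjk  (real, zero-mean case)
    return (M4
            - np.einsum('ij,kl->ijkl', C2, C2)
            - np.einsum('ik,jl->ijkl', C2, C2)
            - np.einsum('il,jk->ijkl', C2, C2))
```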
“…Next, representative methods of two classes, including the most used ICA techniques in signal processing, are briefly described and studied in terms of performance and numerical complexity: techniques based on the Differential Entropy (DE) such as (extended) InfoMax [16,17], PICA [19] and two different implementations of FastICA [18, ch. 6], versus cumulant-based methods. Among cumulant-based techniques, representative algorithms of three subfamilies are studied: i) techniques using only the SO statistics of the data, such as SOBI [14,15], SOBIrob [20] and TFBSS [21]; ii) algorithms based on SO and FO statistics, such as JADE [22] and CoM2 [23]; and iii) methods requiring only HO statistics, such as ERICA [24], SIMBEC [25], FOBIUMJAD [26,27], ICAR3 [28,29], FOOBI1 [30] and 4-CANDHAPc [31,32]. Quantitative results are obtained on simulated epileptic data generated with a physiologically plausible model [33][34][35].…”
Section: Introduction
confidence: 99%
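To make the contrast between the two families concrete, here is a textbook one-unit FastICA-style fixed-point iteration on whitened data, using the kurtosis nonlinearity g(u) = u³. This is only a generic sketch, not any of the cited implementations, and all names are illustrative.

```python
import numpy as np

def whiten(X):
    """Whiten zero-mean data X, shape (n_channels, n_samples); assumes full-rank covariance."""
    d, E = np.linalg.eigh(np.cov(X))
    W = E @ np.diag(d ** -0.5) @ E.T   # whitening matrix
    return W @ X, W

def fastica_one_unit(Z, n_iter=100, tol=1e-8, seed=0):
    """Extract one independent component from whitened data Z via a fixed-point iteration."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(Z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        u = w @ Z
        # w+ = E[Z g(u)] - E[g'(u)] w, with g(u) = u^3 and E[u^2] = 1 after whitening
        w_new = (Z * u**3).mean(axis=1) - 3 * w
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1) < tol:   # converged up to sign
            return w_new
        w = w_new
    return w
```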
“…CP algorithms available in the literature compute all mode gradients, either sequentially (as in alternating algorithms [1], [2], [5], [19], [42], [45], [46]) or simultaneously (as in all-at-once algorithms [21], [24], [26]-[29] and line search [20], [21]). This section will present a fast method to compute the gradients recursively for all modes.…”
Section: Progressive Computation of All Mode CP Gradients
confidence: 99%
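A toy illustration of the underlying idea: the mode-n gradient is the mode-n unfolding times the Khatri-Rao product of the other factors, and partial contractions can be shared across modes. The cited fast method is a more elaborate recursive scheme for arbitrary order; names here are illustrative, for a 3-way tensor.

```python
import numpy as np

def cp_gradients(T, A, B, C):
    """All three mode gradients G_n = X_(n) (Khatri-Rao of the other factors),
    reusing one partial contraction for two of the modes."""
    W = np.einsum('ijk,kr->ijr', T, C)          # contract with C once, reuse below
    G_A = np.einsum('ijr,jr->ir', W, B)         # gradient w.r.t. A
    G_B = np.einsum('ijr,ir->jr', W, A)         # gradient w.r.t. B
    G_C = np.einsum('ijk,ir,jr->kr', T, A, B)   # gradient w.r.t. C
    return G_A, G_B, G_C
```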
“…This confirms that FastALS not only has a lower computational cost, but also requires less memory than CP_ALS. Other alternating algorithms for CPD, with or without additional constraints such as nonnegativity or orthogonality [38], [45], [46], [48], [49], can be accelerated in a similar way.…”
Section: Progressive Computation of All Mode CP Gradients
confidence: 99%
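One simple way (illustrative only, not the specific constrained schemes of the cited works) to graft a nonnegativity constraint onto such an alternating factor update is to project the unconstrained least-squares solution onto the nonnegative orthant:

```python
import numpy as np

def nonneg_als_update(Xn, KR, gram, eps=1e-12):
    """Projected least-squares update of one CP factor.
    Xn: mode-n unfolding; KR: Khatri-Rao product of the other factors; gram: KR.T @ KR."""
    F = Xn @ KR @ np.linalg.pinv(gram)   # unconstrained LS solution
    return np.maximum(F, eps)            # clip negatives: crude nonnegativity projection
```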
“…On the other hand, in the (2D) matrix case, very efficient low-rank constrained matrix factorization techniques (also called penalized matrix factorizations) have been developed, such as principal component analysis (PCA), ICA [25][26][27][28], sparse component analysis (SCA) [29,30], smooth component analysis (SmoCA) [5] and nonnegative matrix factorization (NMF) [5,31], to name just a few. These matrix factorization techniques have their own biases and advantages, and are widely applied to blind source separation (BSS), dimensionality reduction, data compression and feature extraction, by exploiting various assumptions and a priori knowledge.…”
Section: Introduction and Problem Statement
confidence: 99%
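As a concrete instance of one technique from this family, here is a minimal sketch of Lee-Seung multiplicative updates for NMF under the Frobenius loss. This is the textbook algorithm, not the penalized variants the citing paper discusses.

```python
import numpy as np

def nmf(V, rank, n_iter=200, eps=1e-12, seed=0):
    """Factor a nonnegative matrix V ~= W @ H via Lee-Seung multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update H; ratios keep entries nonnegative
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update W
    return W, H
```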