2015 · DOI: 10.1109/tsp.2015.2469642
Joint Tensor Factorization and Outlying Slab Suppression With Applications

Abstract: We consider factoring low-rank tensors in the presence of outlying slabs. This problem is important in practice, because data collected in many real-world applications, such as speech, fluorescence, and some social network data, fit this paradigm. Prior work tackles this problem by iteratively selecting a fixed number of slabs and fitting, a procedure which may not converge. We formulate this problem from a group-sparsity-promoting point of view, and propose an alternating optimization framework to handle the …
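To make the group-sparsity idea concrete, here is a minimal Python sketch, not the authors' code: the data are modeled as a low-rank CP part plus a slab-wise outlier term O, and an l2,1 penalty on O is handled by its proximal operator, group soft-thresholding. The regularization weight lam and the toy dimensions are assumptions.

```python
import numpy as np

def group_soft_threshold(O, lam):
    """Prox of lam * sum_k ||O[:, :, k]||_F: shrinks whole slabs toward zero."""
    out = np.zeros_like(O)
    for k in range(O.shape[2]):
        nrm = np.linalg.norm(O[:, :, k])
        if nrm > lam:
            out[:, :, k] = (1.0 - lam / nrm) * O[:, :, k]
    return out

# Toy data: a rank-2 CP tensor whose frontal slabs 5 and 17 are grossly corrupted.
rng = np.random.default_rng(0)
I, J, K, R = 20, 20, 30, 2
A, B, C = (rng.standard_normal((d, R)) for d in (I, J, K))
X = np.einsum('ir,jr,kr->ijk', A, B, C)
X[:, :, [5, 17]] += 10.0 * rng.standard_normal((I, J, 2))

# One illustrative alternating step: with the low-rank part held at the true
# factors, the outlier update is group soft-thresholding of the residual.
residual = X - np.einsum('ir,jr,kr->ijk', A, B, C)
O = group_soft_threshold(residual, lam=5.0)
flagged = np.where(np.linalg.norm(O, axis=(0, 1)) > 0)[0]
print(flagged)  # [ 5 17 ]
```

On this noiseless toy, only the two corrupted slabs survive the shrinkage, which is exactly the slab-suppression behavior the abstract describes.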

Cited by 57 publications (33 citation statements) · References 49 publications
“…This is the main attraction of the AO-ADMM approach of [102], which can also deal with more general loss functions and missing elements, while maintaining the monotone decrease of the cost and conceptual simplicity of ALS. The AO-ADMM framework has been recently extended to handle robust tensor factorization problems where some slabs are grossly corrupted [106].…”
Section: G. Constraints · mentioning · confidence: 99%
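As a concrete sketch of the quoted machinery (an illustrative reimplementation under assumed parameter choices, not the code of [102] or [106]): AO-ADMM replaces each exact ALS subproblem with a few ADMM steps. Below is such an inner solver for one nonnegativity-constrained least-squares subproblem min_{H >= 0} ||M - WH||_F^2, where the Gram matrix and its Cholesky factor are formed once and reused.

```python
import numpy as np

def admm_nnls(W, M, n_iter=50, rho=None):
    """ADMM for min_{H >= 0} ||M - W @ H||_F^2 (one AO-ADMM subproblem)."""
    G = W.T @ W                                  # Gram matrix, formed once
    if rho is None:
        rho = np.trace(G) / G.shape[0]           # step-size heuristic (assumed)
    L = np.linalg.cholesky(G + rho * np.eye(G.shape[0]))  # cached factor
    WtM = W.T @ M
    H = np.zeros((G.shape[0], M.shape[1]))       # nonnegative copy
    U = np.zeros_like(H)                         # scaled dual variable
    for _ in range(n_iter):
        rhs = WtM + rho * (H - U)
        Ht = np.linalg.solve(L.T, np.linalg.solve(L, rhs))  # LS step
        H = np.maximum(0.0, Ht + U)              # project onto H >= 0
        U += Ht - H                              # dual update
    return H

# Toy check: on noiseless data with an interior solution, ADMM recovers H.
rng = np.random.default_rng(1)
W, H_true = rng.random((30, 4)), rng.random((4, 10))
H_est = admm_nnls(W, W @ H_true)
print(np.max(np.abs(H_est - H_true)))  # close to zero
```

Caching the Cholesky factor keeps the inner iterations cheap relative to re-solving the normal equations, and the same template accommodates other proximable constraints by swapping the projection step.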
“…This suggests that it can also be beneficial to perform simultaneous clustering and factorization. This idea is explored by the authors of Ref , where they demonstrate the effectiveness of such an approach.…”
Section: Tensor-Based Models in RS · mentioning · confidence: 99%
“…within the processed tensor (also known as outliers), whether implemented by means of HOSVD or HOOI [18]–[20]. The same sensitivity has also been documented in PCA, which is a special case of Tucker for 2-way tensors (matrices).…”
mentioning · confidence: 70%
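A small synthetic illustration of this sensitivity (our sketch, not an experiment from [18]–[20]): corrupting a single frontal slab of an otherwise exactly low-rank tensor rotates the leading mode-1 singular subspace, which is precisely what HOSVD estimates per mode.

```python
import numpy as np

rng = np.random.default_rng(2)
I, J, K, R = 30, 30, 40, 3
U = np.linalg.qr(rng.standard_normal((I, R)))[0]       # true mode-1 subspace
X = np.einsum('ir,rjk->ijk', U, rng.standard_normal((R, J, K)))

def mode1_subspace(T, r):
    """Leading r left singular vectors of the mode-1 unfolding (HOSVD step)."""
    return np.linalg.svd(T.reshape(T.shape[0], -1), full_matrices=False)[0][:, :r]

X_bad = X.copy()
X_bad[:, :, 0] += 50.0 * rng.standard_normal((I, J))   # one outlying slab

for name, T in [("clean", X), ("corrupted", X_bad)]:
    Uh = mode1_subspace(T, R)
    cosines = np.linalg.svd(U.T @ Uh, compute_uv=False)
    angle = np.degrees(np.arccos(np.clip(cosines.min(), -1.0, 1.0)))
    print(f"{name}: largest principal angle = {angle:.1f} deg")
```

On the clean tensor the principal angle is essentially zero; with one heavy slab the estimated subspace swings far from the true one.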
“…Large datasets often contain heavily corrupted, outlying entries due to various causes, such as sensor malfunctions, errors in data storage/transfer, heavy-tail noise, intermittent variations of the sensing environment, and even intentional dataset "poisoning" [31]. Regrettably, such corruptions that lie far from the sought-after subspaces are known to significantly affect PCA and its multi-way generalization, Tucker, even when they appear as a small fraction of the processed data [18], [20], [21]. Accordingly, in such cases, the performance of any application that relies on PCA and Tucker can be compromised.…”
Section: Data Corruption and L1-Norm Reformulation of PCA · mentioning · confidence: 99%
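A hedged sketch of the L1-norm route this section refers to (generic single-component L1-PCA, not necessarily the exact algorithm of the cited works): the reformulation max_{||u||_2=1} ||X^T u||_1 = max_{b in {-1,+1}^N} ||X b||_2 makes the problem combinatorial, so for small N the exact L1 principal component can be found by exhausting sign vectors. The toy data are assumptions.

```python
import itertools
import numpy as np

def l1_pc_exhaustive(X):
    """Exact single L1 principal component of D x N data X (small N):
    max_{||u||=1} ||X.T @ u||_1  ==  max_{b in {-1,+1}^N} ||X @ b||_2,
    with maximizer u* = X @ b* / ||X @ b*||."""
    best, u_best = -np.inf, None
    for signs in itertools.product((-1.0, 1.0), repeat=X.shape[1]):
        v = X @ np.asarray(signs)
        nrm = np.linalg.norm(v)
        if nrm > best:
            best, u_best = nrm, v / nrm
    return u_best

rng = np.random.default_rng(3)
mags = rng.uniform(1.0, 2.0, 12) * rng.choice([-1.0, 1.0], 12)
X = np.vstack([mags, np.zeros(12)])        # inliers all along e1
X = np.hstack([X, [[0.0], [10.0]]])        # one gross outlier along e2
u_l2 = np.linalg.svd(X)[0][:, 0]           # L2 PC locks onto the outlier
u_l1 = l1_pc_exhaustive(X)                 # L1 PC tilts far less
print("L2 PC:", np.round(u_l2, 2))         # ~[0, 1] up to sign
print("L1 PC:", np.round(u_l1, 2))         # mostly along e1
```

Because the outlier's influence enters the L1 objective linearly rather than quadratically, the L1 component tilts far less toward the corrupted sample than the L2 component, which locks onto it.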