2016
DOI: 10.1109/tsp.2016.2612171

Polar $n$-Complex and $n$-Bicomplex Singular Value Decomposition and Principal Component Pursuit

Abstract: Informed by recent work on tensor singular value decomposition and circulant algebra matrices, this paper presents a new theoretical bridge that unifies the hypercomplex and tensor-based approaches to singular value decomposition and robust principal component analysis. We begin our work by extending the principal component pursuit to Olariu's polar n-complex numbers as well as their bicomplex counterparts. In so doing, we have derived the polar n-complex and n-bicomplex proximity operators for both th…


Cited by 23 publications (9 citation statements)
References 37 publications
“…Definition 2.2. (identity tensor) [6] The identity tensor $\mathcal{I} \in \mathbb{R}^{n \times n \times n_3}$ is the tensor with its first frontal slice $\mathcal{I}^{(1)}$ being the $n \times n$ identity matrix, and the other frontal slices being zeros.…”
Section: Definitions and Propositions
confidence: 99%
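As a quick illustration of the definition quoted above, here is a minimal NumPy sketch that builds such an identity tensor; the function name and argument order are illustrative, not taken from the cited papers.

```python
import numpy as np

def identity_tensor(n, n3):
    """t-product identity tensor I of shape (n, n, n3):
    the first frontal slice is the n x n identity, all others are zero."""
    I = np.zeros((n, n, n3))
    I[:, :, 0] = np.eye(n)  # frontal slice I^(1) is the identity matrix
    return I
```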
“…Several forms of the regularization operator L are presented in [12,14] and here we focus on L = I, an identity tensor throughout this paper. The operator * denotes the tensor-tensor t-product introduced in the seminal work [6,7], which has been proved to be a useful tool with a large number of applications, such as image processing [6,11,16,18], signal processing [1,9,10], tensor recovery and robust tensor PCA [8,9], data completion and denoising [4,10,19]. Compared to the tensor least squares (t-LS) of the form min…”
Section: Introduction
confidence: 99%
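The t-product referred to in this excerpt is commonly computed by taking the FFT along the third mode, multiplying the frontal slices facewise, and inverting the transform; the sketch below assumes that standard construction, with names of my own choosing rather than from the cited papers.

```python
import numpy as np

def t_product(A, B):
    """Tensor-tensor t-product of A (n1 x n2 x n3) with B (n2 x n4 x n3)."""
    n3 = A.shape[2]
    Af = np.fft.fft(A, axis=2)          # transform along the third mode
    Bf = np.fft.fft(B, axis=2)
    Cf = np.empty((A.shape[0], B.shape[1], n3), dtype=complex)
    for k in range(n3):
        Cf[:, :, k] = Af[:, :, k] @ Bf[:, :, k]   # facewise matrix products
    return np.real(np.fft.ifft(Cf, axis=2))       # back to the original domain
```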
“…then the T-characteristic polynomial $P_T(x)$ has the expression $P_T(x) := \mathrm{LCM}\big(P_{D_1}(x), P_{D_2}(x), \cdots, P_{D_p}(x)\big)$, (7) where 'LCM' denotes the least common multiple, {1, 2, . .…”
Section: T-Characteristic and T-Minimal Polynomial
confidence: 99%
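To make equation (7) concrete, here is a small SymPy sketch that takes the least common multiple of the characteristic polynomials of two illustrative block matrices $D_1, D_2$; the matrices are placeholders, not taken from the cited paper.

```python
import sympy as sp

x = sp.symbols('x')
D = [sp.Matrix([[2, 0], [0, 3]]),    # D_1: characteristic polynomial (x-2)(x-3)
     sp.Matrix([[3, 1], [0, 3]])]    # D_2: characteristic polynomial (x-3)^2

char_polys = [Di.charpoly(x).as_expr() for Di in D]
P_T = sp.lcm_list(char_polys)        # P_T(x) = LCM(P_{D_1}(x), ..., P_{D_p}(x))
print(sp.factor(P_T))                # factored P_T, e.g. (x - 2)*(x - 3)**2
```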
“…On the other hand, the tensor T-product introduced by Kilmer [21] has been proved to be of great use in many areas, such as image processing [21,29,35,37], computer vision [12], signal processing [7,24,25], low rank tensor recovery and robust tensor PCA [23,24], data completion and denoising [16,25,38]. An approach of linearization is provided by the T-product to transfer tensor multiplication to matrix multiplication by the discrete Fourier transformation and the theories of block circulant matrices [6,19].…”
Section: Introduction
confidence: 99%
“…For $p=q=0$, we define $\|A\|_{0,0}$ as the number of non-zero entries in $A$. The hard-thresholding operator is defined as [53]
$$\mathcal{H}_{\epsilon}(x) = \begin{cases} x, & |x| > \epsilon, \\ 0, & \text{otherwise}. \end{cases}$$
The shrinkage thresholding operator is defined as [54, 55]
$$\mathcal{S}_{\epsilon}(x) = \begin{cases} x - \epsilon, & x > \epsilon, \\ x + \epsilon, & x < -\epsilon, \\ 0, & \text{otherwise}. \end{cases}$$…”
Section: Mathematical Preliminaries
confidence: 99%
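For reference, a minimal NumPy sketch of the two elementwise operators quoted above, hard thresholding $\mathcal{H}_{\epsilon}$ and soft (shrinkage) thresholding $\mathcal{S}_{\epsilon}$; the function names are illustrative.

```python
import numpy as np

def hard_threshold(x, eps):
    """H_eps(x): keep entries with |x| > eps, zero out the rest."""
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) > eps, x, 0.0)

def soft_threshold(x, eps):
    """S_eps(x): x - eps if x > eps, x + eps if x < -eps, 0 otherwise."""
    x = np.asarray(x, dtype=float)
    return np.sign(x) * np.maximum(np.abs(x) - eps, 0.0)
```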