2013 18th International Conference on Digital Signal Processing (DSP)
DOI: 10.1109/icdsp.2013.6622725
Tensor dictionary learning with sparse TUCKER decomposition

Abstract: Dictionary learning algorithms are typically derived for one- or two-dimensional signals using vector-matrix operations. Little attention has been paid to the problem of dictionary learning over high-dimensional tensor data. We propose a new algorithm for dictionary learning based on tensor factorization using a TUCKER model. In this algorithm, sparseness constraints are applied to the core tensor, whose n-mode factors are learned from the input data in an alternating-minimization manner using…
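The alternating scheme outlined in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function name `sparse_tucker`, the SVD-based factor update, and the soft-threshold parameter `lam` are all assumptions, loosely following HOSVD-style alternating updates with a soft-thresholded (sparse) core.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move the given mode to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of unfold for a tensor of the given target shape."""
    order = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(order), 0, mode)

def mode_dot(T, M, mode):
    """Mode-n product T x_n M (multiply every mode-n fiber by M)."""
    new_shape = T.shape[:mode] + (M.shape[0],) + T.shape[mode + 1:]
    return fold(M @ unfold(T, mode), mode, new_shape)

def sparse_tucker(X, ranks, lam=0.1, iters=20, seed=0):
    """Alternating Tucker factorization with a soft-thresholded (sparse) core."""
    rng = np.random.default_rng(seed)
    # Random orthonormal initial n-mode factors.
    U = [np.linalg.qr(rng.standard_normal((X.shape[n], r)))[0]
         for n, r in enumerate(ranks)]
    for _ in range(iters):
        for n in range(X.ndim):
            # Project X onto all factors except mode n, then take the leading
            # left singular vectors of the unfolding as the new mode-n factor.
            G = X
            for m in range(X.ndim):
                if m != n:
                    G = mode_dot(G, U[m].T, m)
            Un = np.linalg.svd(unfold(G, n), full_matrices=False)[0]
            U[n] = Un[:, :ranks[n]]
        # Core tensor = projection of X onto all factors, then soft-thresholded
        # to enforce the sparseness constraint on the core.
        G = X
        for m in range(X.ndim):
            G = mode_dot(G, U[m].T, m)
        G = np.sign(G) * np.maximum(np.abs(G) - lam, 0.0)
    return G, U
```

Soft-thresholding the core after each sweep is one common way to impose the sparseness constraint; the paper's exact update rule may differ.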


Cited by 66 publications (44 citation statements). References 9 publications.
“…To investigate the classification performance of our proposed algorithm, GT-D, over other tensor decomposition methods, we apply GT-D to a speaker identification problem and compare the results with baseline TUCKER decomposition algorithms: the standard TUCKER method TALS [3] and the state-of-the-art GT-G [10], HALS [6], TCCD [11], and APG [12] methods. We use all of these algorithms with their default parameters.…”
Section: Experiments and Results (mentioning)
confidence: 99%
“…For each classification application, the test feature matrix Y_test is projected onto each class-specific classifier tensor obtained by (10) to get the coefficients using the least-squares method. The resulting coefficients are used to reconstruct the test feature matrix Ŷ_test^(i) with the help of each D_i^(1).…”
Section: Signal Classification (mentioning)
confidence: 99%
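The reconstruction-based classification rule described in this excerpt can be sketched as follows, assuming plain matrix dictionaries D_i in place of the paper's class-specific classifier tensors (the function name and all sizes are illustrative assumptions):

```python
import numpy as np

def classify(Y_test, dictionaries):
    """Assign Y_test to the class whose dictionary reconstructs it best."""
    errors = []
    for D in dictionaries:
        # Least-squares coefficients of Y_test over D's atoms.
        C = np.linalg.lstsq(D, Y_test, rcond=None)[0]
        # Reconstruction error of Y_test from those coefficients.
        errors.append(np.linalg.norm(Y_test - D @ C))
    return int(np.argmin(errors))

# Usage: a signal built from class 0's dictionary is assigned to class 0.
rng = np.random.default_rng(0)
D0, D1 = rng.standard_normal((8, 3)), rng.standard_normal((8, 3))
Y = D0 @ rng.standard_normal((3, 5))
print(classify(Y, [D0, D1]))  # → 0
```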
“…2-KS dictionaries). The model was extended to 3rd-order tensors (3-KS dictionaries) [12,19] and even to an arbitrary tensor order [4,7] based on the Tucker decomposition, a model coined Tucker Dictionary Learning. However, none of these works includes a sum of Kronecker terms.…”
Section: Related Work (mentioning)
confidence: 99%
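As background for the Kronecker-structured (KS) dictionaries mentioned in this excerpt, the defining identity of a 2-KS dictionary — acting on a matrix via two small factors is equivalent to one big Kronecker-product dictionary acting on its vectorization — can be checked numerically. The names and sizes below are arbitrary illustrations:

```python
import numpy as np

rng = np.random.default_rng(0)
D1 = rng.standard_normal((4, 3))   # mode-1 factor (sizes are arbitrary)
D2 = rng.standard_normal((5, 2))   # mode-2 factor
X = rng.standard_normal((3, 2))    # coefficient matrix

# A 2-KS dictionary acts as the Kronecker product of its two factors:
# vec(D1 @ X @ D2.T) == kron(D2, D1) @ vec(X), with column-major vec.
lhs = (D1 @ X @ D2.T).flatten(order="F")
rhs = np.kron(D2, D1) @ X.flatten(order="F")
assert np.allclose(lhs, rhs)
```

A sum-of-Kronecker-terms model, which the excerpt notes is absent from these works, would replace the single `np.kron(D2, D1)` with a sum of such products.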
“…These two decompositions can be viewed as generalizations of the singular value decomposition (SVD) [33]. With a tensor decomposition method, conventional dictionary learning (denoted vectorized dictionary learning, abbreviated VDL) can be extended to tensor-based dictionary learning (TDL) [34]–[36], which ought to be more powerful in capturing structure and in representing a multidimensional array more sparsely. In 2015, we proposed an adaptive tensor-based spatio-temporal dictionary learning method for 4D CT reconstruction [37].…”
Section: Introduction (mentioning)
confidence: 99%