Two-Directional Two-Dimensional Kernel Canonical Correlation Analysis
2019 · DOI: 10.1109/lsp.2019.2939986

Cited by 14 publications (9 citation statements) · References 28 publications
“…This is substantially less than 40, the number of classes. Next, these samples are utilized to construct the first-view data set, and the wavelet transform [34] is performed twice on each sample in the first data set to generate the corresponding second data set. Moreover, two experimental settings are adopted: front-left and front-right.…”
Section: Experimental Results and Analysis
confidence: 99%
“…Each subject provides three samples with a size of 20 × 20 pixels according to three different poses (front, left, and right). Then, all 600 samples are utilized to construct the 2D data set X, and the wavelet transform [10] is performed twice on each sample in the data set X to generate the corresponding 2D data set Y. Moreover, two experimental settings are adopted: front-left and front-right.…”
Section: B. Experiments Under Different Poses on the FERET Database
confidence: 99%
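The quoted setups build the second view by running a 2D wavelet transform twice over each first-view sample. The excerpts do not specify the wavelet family or which sub-bands are retained, so the sketch below is a minimal illustration only: it assumes PyWavelets' dwt2 with a Haar wavelet and keeps just the approximation coefficients at each pass, and the function name second_view and the random data are invented for the example.

```python
# Minimal sketch of building a second view by applying a 2D wavelet
# transform twice to each sample. The wavelet family ('haar') and the
# choice to keep only approximation coefficients are assumptions here,
# not details taken from the cited papers.
import numpy as np
import pywt  # PyWavelets

def second_view(sample_2d, wavelet="haar"):
    """Apply the 2D DWT twice, keeping the approximation sub-band each time."""
    cA1, _ = pywt.dwt2(sample_2d, wavelet)   # first pass: approximation coefficients
    cA2, _ = pywt.dwt2(cA1, wavelet)         # second pass on the approximation
    return cA2

# Example: 600 samples of size 20 x 20, as in the FERET setting quoted above.
X = np.random.rand(600, 20, 20)              # first-view 2D data set X
Y = np.stack([second_view(x) for x in X])    # corresponding 2D data set Y
print(X.shape, Y.shape)                      # (600, 20, 20) (600, 5, 5)
```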
“…Assume we have N two-dimensional samples of size m × n. Then the computational complexity of traditional CCA is of order O((mn)³), while 2DCCA only requires a computational complexity of O(m³) or O(n³). Moreover, local two-dimensional canonical correlation analysis (L2DCCA) [17] and two-directional two-dimensional kernel canonical correlation analysis ((2D)²KCCA) [10] have been proposed as extensions of 2DCCA. In L2DCCA, the local structural information is introduced into the 2DCCA space, revealing more useful representations between 2D data sets with a computational complexity on the order of O((m…”
Section: Introduction
confidence: 99%
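As a rough illustration of the complexity gap this excerpt describes, the toy calculation below compares the eigenproblem sizes implied by O((mn)³) for classical CCA on vectorized samples against O(m³) and O(n³) for 2DCCA. The 20 × 20 sample size is taken from the FERET excerpt above; the operation counts are order-of-magnitude stand-ins, not measurements from the papers.

```python
# Rough cost comparison implied by the quoted complexities: classical CCA
# solves an eigenproblem on (mn x mn) covariance matrices after vectorizing
# each m x n sample, whereas 2DCCA works with m x m (row) and n x n (column)
# matrices. The sizes below are illustrative only.
m, n = 20, 20                       # sample size from the FERET excerpt above

cca_dim   = m * n                   # length of a vectorized sample
flops_cca = cca_dim ** 3            # O((mn)^3) eigen-decomposition cost
flops_2d  = m ** 3 + n ** 3         # O(m^3) + O(n^3) for the two directions

print(f"CCA   ~ {flops_cca:.1e} operations on a {cca_dim}x{cca_dim} matrix")
print(f"2DCCA ~ {flops_2d:.1e} operations on {m}x{m} and {n}x{n} matrices")
# For 20x20 samples this is 6.4e7 vs 1.6e4, a factor of roughly 4000.
```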
“…In pattern recognition and machine learning, canonical correlation analysis (CCA) is a typical method for analyzing correlations between two or more feature views of a given data set [1], [2], and it is widely adopted to extract related representations from individual views and fuse them for pattern classification tasks [3], [4], [5], [6], [7]. Specifically, given two (or more) feature views of the data of interest, traditional CCA aims to find corresponding projection directions for the individual views along which the correlations of the views are maximized.…”
Section: Introduction
confidence: 99%
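To make the stated objective concrete, here is a minimal sketch of standard two-view CCA on synthetic data using scikit-learn's CCA estimator. The data and dimensions are invented for illustration, and the kernel and two-directional 2D variants discussed in the paper are not implemented here.

```python
# Minimal sketch of standard two-view CCA: find projection directions for
# X and Y that maximize the correlation of the projected views. Uses
# scikit-learn's CCA on synthetic data; it does not implement the kernel
# or two-directional 2D variants discussed in the paper.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 2))                 # shared latent signal
X = latent @ rng.normal(size=(2, 10)) + 0.1 * rng.normal(size=(500, 10))
Y = latent @ rng.normal(size=(2, 8))  + 0.1 * rng.normal(size=(500, 8))

cca = CCA(n_components=2)
X_c, Y_c = cca.fit_transform(X, Y)                 # projected (canonical) views

# Correlation along each canonical direction should be close to 1 here,
# since both views were generated from the same latent signal.
for k in range(2):
    r = np.corrcoef(X_c[:, k], Y_c[:, k])[0, 1]
    print(f"canonical correlation {k + 1}: {r:.3f}")
```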