2015
DOI: 10.1007/s00500-015-1794-2

Fisher discriminant analysis based on kernel cuboid for face recognition

Abstract: This paper introduces the concept of the kernel cuboid and proposes a new kernel-based image feature extraction method for face recognition. The proposed method processes a face image in a block-wise manner and independently performs kernel discriminant analysis on every block set, using a kernel cuboid instead of a kernel matrix. Experimental results on the ORL and UMIST face databases show the effectiveness and scalability of the proposed method.
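The paper's kernel-cuboid construction is not reproduced on this page, so the following is only a minimal sketch of the two ingredients the abstract names: splitting an image into blocks and running a standard two-class kernel Fisher discriminant on each block set. The RBF kernel, the `gamma`/`reg` parameters, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def image_blocks(img, bh, bw):
    """Split an image into non-overlapping bh x bw blocks, each flattened.

    (Illustrative stand-in for the paper's block-wise treatment.)"""
    H, W = img.shape
    return [img[i:i + bh, j:j + bw].ravel()
            for i in range(0, H, bh) for j in range(0, W, bw)]

def kfd_fit(X, y, gamma=0.5, reg=1e-3):
    """Standard two-class kernel Fisher discriminant (dual form).

    Returns the dual coefficients alpha together with the training data,
    so new samples can be projected onto the discriminant direction."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]
    M0 = K[:, idx0].mean(axis=1)   # mean kernel vector of class 0
    M1 = K[:, idx1].mean(axis=1)   # mean kernel vector of class 1
    # Within-class scatter in the kernel-induced feature space
    N = np.zeros((n, n))
    for idx in (idx0, idx1):
        Kj = K[:, idx]
        nj = len(idx)
        N += Kj @ (np.eye(nj) - np.full((nj, nj), 1.0 / nj)) @ Kj.T
    # alpha is proportional to N^{-1}(M1 - M0); reg stabilises the solve
    alpha = np.linalg.solve(N + reg * np.eye(n), M1 - M0)
    return alpha, X, gamma

def kfd_project(model, Xnew):
    """Project new samples onto the learned discriminant direction."""
    alpha, Xtr, gamma = model
    return rbf_kernel(Xnew, Xtr, gamma) @ alpha
```

In a block-wise scheme like the one the abstract describes, `kfd_fit` would be run once per block set and the per-block projections concatenated into the final feature vector.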

Cited by 5 publications (4 citation statements)
References 28 publications
“…If M_1, N_1, and P_1 are connected counterclockwise, we establish a 2D coordinate system XOY as shown in Figure 7; p_i and q_i denote the corresponding lengths of the 2D coordinates. The coordinate value of the i-th 2D point T_i is as shown in (14).…”
Section: Side of the Oriented Bounding Box
confidence: 99%
“…From the set of nonlinear algorithms that retain the global features, several algorithms use kernel technology, such as Kernel Principal Component Analysis [12] and Kernel Fisher Discriminant Analysis [13], which can be seen as nonlinear versions of classic algorithms such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). Another type of dimensionality reduction algorithm is based on distance retention and includes Multidimensional Scaling (MDS) [14], which keeps the distances between the original spatial samples and the distances in the low-dimensional space as close as possible; Isometric Mapping (Isomap) [15], a non-iterative, globally optimized dimensionality reduction algorithm that requires the geodesic distance to remain constant before and after dimensional reduction; and Diffusion Maps [16], which reduces dimensionality through diffusion processes to remove redundant information.…”
Section: Introduction
confidence: 99%
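As a minimal sketch of one of the kernel methods this excerpt names, the snippet below implements kernel PCA from scratch with an RBF kernel: compute the kernel matrix, center it in feature space, and take the leading eigenvectors. The `gamma` value and function name are illustrative assumptions, not taken from any of the cited works.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=0.5):
    """Kernel PCA with an RBF kernel: a nonlinear analogue of PCA.

    Returns the projections of the training samples onto the leading
    principal components in the kernel-induced feature space."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * d2)
    # Center the kernel matrix in feature space
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    # eigh returns ascending eigenvalues; keep the largest n_components
    w, V = np.linalg.eigh(Kc)
    order = np.argsort(w)[::-1][:n_components]
    w, V = w[order], V[:, order]
    # Projections of the training points are V scaled by sqrt(eigenvalue)
    return V * np.sqrt(np.maximum(w, 1e-12))
```

Replacing the eigen-decomposition of the centered kernel with a generalized eigenproblem on between- and within-class scatter matrices gives the Kernel Fisher Discriminant Analysis variant the excerpt also mentions.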
“…With the rapid development of computer science and technology, multivariable statistical process monitoring (MSPM) has become the primary tool for application in this field [1][2][3][4][5][6][7][8]. MSPM methods, such as principal component analysis (PCA) [9], partial least squares [10], independent component analysis [11], and Fisher discriminant analysis [12], are suitable for massive data, and their extensions have been extensively applied to chemical production processes. In addition, machine-learning-based methods, such as the k-nearest neighbor and support vector machine, perform well in fault detection [13][14][15].…”
Section: Introduction
confidence: 99%
“…Many dimensionality reduction methods have been proposed, and most of them have been successfully applied to face-recognition problems [1][2][3][4]. The most popular algorithms are principal component analysis (PCA) [5], linear discriminant analysis (LDA) [6], and independent component analysis [7]. Recently, many methods have been proposed to improve these traditional dimensionality reduction methods.…”
Section: Introduction
confidence: 99%