An efficient face recognition method using contourlet and curvelet transform
2020 | DOI: 10.1016/j.jksuci.2017.10.010

Cited by 28 publications (18 citation statements) | References 20 publications
“…A dimensionality reduction of the feature space was proposed based on the entropy of the transform coefficients. The selected features were then used to recognize face images with a support vector machine (SVM) classifier [ 8 ].…”
Section: Introduction (mentioning)
confidence: 99%
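The statement above describes a pipeline of entropy-based feature selection followed by SVM classification. The sketch below illustrates that general idea; it is not the authors' exact implementation, and the data, dimensions, and hyperparameters are hypothetical stand-ins for contourlet/curvelet subband coefficients.

```python
# Minimal sketch (not the cited authors' exact pipeline): rank coefficient
# features by Shannon entropy, keep the most informative ones, and classify
# faces with an SVM. X stands in for already-extracted subband coefficients.
import numpy as np
from sklearn.svm import SVC

def entropy_per_feature(X, bins=32):
    """Shannon entropy of each feature column, estimated from a histogram."""
    ents = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        hist, _ = np.histogram(X[:, j], bins=bins)
        p = hist / max(hist.sum(), 1)
        p = p[p > 0]
        ents[j] = -np.sum(p * np.log2(p))
    return ents

def select_by_entropy(X, keep=100):
    """Indices of the `keep` highest-entropy columns."""
    order = np.argsort(entropy_per_feature(X))[::-1]
    return order[:keep]

# Hypothetical usage with random data in place of real subband features.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 512))
y_train = rng.integers(0, 10, size=200)

idx = select_by_entropy(X_train, keep=100)
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_train[:, idx], y_train)
```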
“…The framework consists of a facial detection and recognition tracking application module, and a data analysis cloud storage module. Biswas and Sil [36] proposed a method for face recognition using contourlet transform (CNT) and curvelet transform (CLT) which improves the rate of face recognition under different challenges. Cheng et al [37] proposed an effective illumination estimation model based on Lambertian reflectance to extract illumination invariants for face recognition under complex illumination conditions.…”
Section: Global Approach (mentioning)
confidence: 99%
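For context, Lambertian-reflectance approaches model an image as the product of reflectance and illumination. The sketch below shows only the generic log-domain normalization idea behind such methods, assuming a simple Gaussian smoothing as the illumination estimate; it is not the specific estimation model of Cheng et al. [37].

```python
# Minimal sketch of Lambertian-style illumination normalization: with I = R * L,
# a large-scale smoothing of log(I) roughly estimates log(L); subtracting it
# leaves an approximately illumination-invariant reflectance term log(R).
import numpy as np
from scipy.ndimage import gaussian_filter

def illumination_invariant(image, sigma=15.0, eps=1e-6):
    """Return log(I) - smooth(log(I)) for a grayscale float image."""
    log_i = np.log(image.astype(np.float64) + eps)
    log_l = gaussian_filter(log_i, sigma=sigma)   # crude illumination estimate
    return log_i - log_l

# Hypothetical usage on a synthetic face-sized image (112 x 92 pixels).
face = np.random.default_rng(1).uniform(0, 255, size=(112, 92))
invariant = illumination_invariant(face)
```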
“…S. Biswas and J. Sil [18] developed a new method for face recognition (FR) to increase the recognition rate, specifically combining the Contourlet Transform (CNT) and Curvelet Transform (CLT). The proposed method provided two advantages, the first being (i) extraction of highly correlated statistical features from the different directional subbands by CNT.…”
Section: Literature Review (mentioning)
confidence: 99%
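The statement mentions statistical features computed from directional subbands. The sketch below shows that kind of per-subband feature extraction, using PyWavelets as a stand-in because standard contourlet/curvelet implementations are not commonly packaged for Python; Biswas and Sil's method operates on CNT/CLT directional subbands rather than the wavelet detail bands used here.

```python
# Minimal sketch of subband statistical-feature extraction (wavelet stand-in
# for contourlet/curvelet subbands): mean, standard deviation, and energy of
# each detail subband are concatenated into one feature vector.
import numpy as np
import pywt

def subband_statistics(image, wavelet="db4", levels=3):
    """Per-subband mean, std-dev, and energy, concatenated as one vector."""
    coeffs = pywt.wavedec2(image.astype(np.float64), wavelet, level=levels)
    feats = []
    for detail in coeffs[1:]:                 # skip the approximation band
        for band in detail:                   # horizontal, vertical, diagonal
            feats.extend([band.mean(), band.std(), np.mean(band ** 2)])
    return np.asarray(feats)

# Hypothetical usage on a synthetic face-sized image (112 x 92 pixels).
vec = subband_statistics(np.random.default_rng(2).uniform(0, 255, (112, 92)))
```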