2017
DOI: 10.1117/1.jei.26.2.023011
Local intensity area descriptor for facial recognition in ideal and noise conditions

Cited by 8 publications (9 citation statements)
References 21 publications
“…Generally, the keypoint is coupled with a descriptor. In this work, a sparse descriptor, which first detects the points of interest in a given image and then samples a local patch and describes its invariant features [28], is employed to produce the keypoint.…”
Section: Keypoint Detection
confidence: 99%
“…In order to compare the efficiency of the proposed and conventional methods, the LBP [11] descriptor was used to represent the face image, and histogram-based features were extracted from the obtained images. The chi-square distance [6], [11], [12] was chosen for the nearest neighbor classifier. The conventional method (CM) is a face recognition method using a nearest neighbor classifier.…”
Section: B. Experimental Settings
confidence: 99%
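The chi-square distance cited above compares two LBP histograms bin by bin, weighting each squared difference by the bins' combined mass. A minimal sketch, assuming NumPy; the function name and the small epsilon guarding empty bins are illustrative, not from the paper:

```python
import numpy as np

def chi_square_distance(h1, h2, eps=1e-10):
    """Chi-square distance between two histograms (e.g. LBP histograms)."""
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    # Per-bin squared difference normalized by the bins' sum;
    # eps avoids division by zero when both bins are empty.
    return float(np.sum((h1 - h2) ** 2 / (h1 + h2 + eps)))
```

In a nearest-neighbor setup, the test histogram is assigned the label of the training histogram with the smallest chi-square distance.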
“…For example, the process for 1-N-N to make a classification is as follows: to classify x from the test data, it finds its closest neighbour within the training data, labels it x^i, and then assigns x the label of x^i [17]. Works such as Guru et al [17] and Tran et al [18] show that the k-N-N classifier is a useful tool for benchmarking the performance of feature extraction and recognition systems. The k-N-N classifier was also selected because it is a simple-to-configure, parameter-free classifier that can efficiently test the quality of features.…”
Section: A. k-N-N Classifier
confidence: 99%
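The 1-N-N step the quote walks through (find the closest training sample, copy its label) can be sketched as follows; this is a generic illustration using Euclidean distance with NumPy, and the names are assumptions rather than the cited papers' code:

```python
import numpy as np

def one_nn_classify(x, train_X, train_y):
    """Assign x the label of its single nearest neighbour in train_X."""
    x = np.asarray(x, dtype=float)
    train_X = np.asarray(train_X, dtype=float)
    # Distance from x to every training sample.
    dists = np.linalg.norm(train_X - x, axis=1)
    # Label of the closest training sample.
    return train_y[int(np.argmin(dists))]
```

Extending this to k-N-N means taking the k smallest distances and voting over their labels; with k = 1 there is nothing to tune, which is why it suits benchmarking feature quality.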