2018
DOI: 10.1109/access.2018.2813395
The Excellent Properties of a Dense Grid-Based HOG Feature on Face Recognition Compared to Gabor and LBP

Abstract: To effectively represent facial features in complex environments, a face recognition method based on dense grid histograms of oriented gradients (HOG) is proposed. First, the face image is divided into numerous dense grids from which HOG features are extracted. Then, all the grid HOG feature vectors are concatenated to form the feature representation of the whole face, and a nearest neighbor classifier is used for recognition. In the FERET face database with complex changes in illumination, time, and environmen…
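The abstract describes a simple pipeline: split the face image into dense grids, extract a HOG descriptor per grid, concatenate them, and match with a nearest neighbor rule. Below is a minimal sketch of that pipeline, assuming scikit-image's `hog` function; the grid size, orientation bins, and Euclidean matching are illustrative assumptions, not the paper's exact settings.

```python
# Minimal sketch of a dense-grid HOG face representation with a nearest
# neighbor matcher. Grid size, HOG parameters, and the distance metric
# are illustrative assumptions, not the paper's exact configuration.
import numpy as np
from skimage.feature import hog

def dense_grid_hog(image, grid=(8, 8), orientations=9):
    """Split a grayscale face image into a grid of patches, extract a HOG
    descriptor from each patch, and concatenate them into one vector."""
    h, w = image.shape
    gh, gw = h // grid[0], w // grid[1]
    features = []
    for r in range(grid[0]):
        for c in range(grid[1]):
            patch = image[r * gh:(r + 1) * gh, c * gw:(c + 1) * gw]
            features.append(hog(patch,
                                orientations=orientations,
                                pixels_per_cell=(gh, gw),
                                cells_per_block=(1, 1),
                                feature_vector=True))
    return np.concatenate(features)

def nearest_neighbor(query, gallery_features, gallery_labels):
    """Return the label of the gallery face closest to the query (Euclidean)."""
    dists = np.linalg.norm(gallery_features - query, axis=1)
    return gallery_labels[int(np.argmin(dists))]

# Toy usage with random "faces"; real inputs would be aligned grayscale crops.
rng = np.random.default_rng(0)
gallery = [rng.random((64, 64)) for _ in range(3)]
labels = np.array([0, 1, 2])
gallery_feats = np.stack([dense_grid_hog(img) for img in gallery])
print(nearest_neighbor(dense_grid_hog(gallery[1]), gallery_feats, labels))  # -> 1
```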

Cited by 54 publications (21 citation statements)
References 33 publications
“…A face recognition method has been presented based on dense grid histograms of oriented gradients (HOG) [5]. In that study, the face image has been divided into many dense grids from which the HOG features have been extracted.…”
Section: Related Work
confidence: 99%
“…Feature descriptors are also applied as methods for face recognition. For instance, [Xiang, Tan and Ye 2018] compares three well-known facial feature descriptors, histograms of oriented gradients (HOG), Gabor, and local binary patterns (LBP), and presents the advantages and disadvantages of each method.…”
Section: Related Work
confidence: 99%
“…Many classifiers have been used for classification, such as the Euclidean distance, the cosine distance, linear discriminant analysis, quadratic discriminant analysis, learning vector quantization, and support vector machines [69]. The minimum Euclidean distance classifier is considered one of the most popular classifiers because it can be easily designed [70] and is widely used [71,72]. In general, it is used to examine the similarity between objects.…”
Section: Classification
confidence: 99%
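The statement above refers to the classic minimum Euclidean distance classifier, which represents each class by a template (typically the class mean) and assigns a query to the class with the nearest template. The sketch below is a toy illustration under that assumption; the feature dimensions and data are made up for the example.

```python
# Minimal sketch of a minimum Euclidean distance classifier: each class is
# summarized by its mean feature vector, and a query is assigned to the
# class whose mean is closest. Data below are toy values for illustration.
import numpy as np

def fit_class_means(features, labels):
    """Compute one mean template per class from training feature vectors."""
    classes = np.unique(labels)
    means = np.stack([features[labels == c].mean(axis=0) for c in classes])
    return classes, means

def min_euclidean_distance(query, classes, means):
    """Assign the query to the class whose mean template is nearest."""
    dists = np.linalg.norm(means - query, axis=1)
    return classes[int(np.argmin(dists))]

# Toy usage: two classes in a 4-D feature space.
X = np.array([[0.0, 0, 0, 0], [0.1, 0, 0, 0], [1.0, 1, 1, 1], [0.9, 1, 1, 1]])
y = np.array([0, 0, 1, 1])
classes, means = fit_class_means(X, y)
print(min_euclidean_distance(np.array([0.05, 0, 0, 0]), classes, means))  # -> 0
```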