2019
DOI: 10.1007/s11042-019-08345-y
An eight-layer convolutional neural network with stochastic pooling, batch normalization and dropout for fingerspelling recognition of Chinese sign language

Cited by 33 publications (20 citation statements). References 35 publications.
“…For example, batch normalization (BN) is introduced to keep each layer's inputs uniformly distributed during training. Dropout is employed to obtain a thinner network with fewer active nodes by randomly dropping neurons with probability P, thereby effectively reducing overfitting and providing a degree of regularization [70]. The ReLU activation is preferred because it accelerates the convergence of stochastic gradient descent compared with traditional saturating activation functions [71].…”
Section: Other Classification Approaches
confidence: 99%
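The three regularization and activation ideas quoted above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the function names, the inverted-dropout scaling, and the omission of BN's learnable scale/shift parameters are simplifying assumptions made here.

```python
import numpy as np

rng = np.random.default_rng(0)

def batch_norm(x, eps=1e-5):
    # Normalize each feature over the batch to zero mean, unit variance,
    # so layer inputs stay uniformly distributed during training
    # (learnable scale and shift omitted for brevity).
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

def relu(x):
    # ReLU: max(0, x); its non-saturating gradient speeds up SGD convergence.
    return np.maximum(0.0, x)

def dropout(x, p, training=True):
    # Inverted dropout: drop each neuron with probability p during training
    # and rescale by 1/(1-p) so expected activations match at test time.
    if not training:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

h = rng.normal(size=(8, 4))   # a toy batch of 8 activation vectors
h = batch_norm(h)
h = relu(h)
h = dropout(h, p=0.5)
```

At inference time `dropout(..., training=False)` is an identity, which is what makes the rescaling during training necessary.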
“…The ReLU activation is preferred because it accelerates the convergence of stochastic gradient descent compared with traditional saturating activation functions [71]. Data augmentation (DA) combines several methods, including PCA color augmentation, affine transformation, noise injection, gamma correction, random shifting, and scaling, which effectively enlarge the dataset and help alleviate overfitting [70].…”
Section: Other Classification Approaches
confidence: 99%