2021
DOI: 10.1007/s13369-021-05471-4
An improved faster-RCNN model for handwritten character recognition

Cited by 71 publications (36 citation statements) | References 68 publications
“…The proposed feature extractor, DenseNet-77, contains fewer parameters and thinner layers than ResNet-101, which gives it a computational advantage over ResNet-101. DenseNet comprises numerous dense blocks (DBs) linked consecutively, with additional convolutional and pooling layers between sequential dense blocks [51,52]. The DenseNet model can represent complicated transformations efficiently, which helps mitigate, to some extent, the lack of object-location data in the significant features.…”
Section: Custom Centernetmentioning
confidence: 99%
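The dense connectivity described in the statement above can be sketched in a few lines: each layer receives the concatenation of all preceding feature maps, so the channel count grows linearly with the growth rate. This is a minimal NumPy illustration of the connectivity pattern only (random weights stand in for the BN-ReLU-Conv layers), not the authors' DenseNet-77 implementation.

```python
import numpy as np

def dense_block(x, num_layers, growth_rate, rng):
    """Minimal dense-block sketch: each layer sees the concatenation
    of all previous feature maps (channels-last layout)."""
    features = [x]
    for _ in range(num_layers):
        inp = np.concatenate(features, axis=-1)       # dense connectivity
        # Stand-in for a BN-ReLU-Conv layer: a linear map to growth_rate channels.
        w = rng.standard_normal((inp.shape[-1], growth_rate)) * 0.01
        features.append(inp @ w)
    return np.concatenate(features, axis=-1)

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8, 16))                    # H x W x C input
out = dense_block(x, num_layers=4, growth_rate=12, rng=rng)
print(out.shape)                                       # (8, 8, 16 + 4*12) = (8, 8, 64)
```

The output channel count is the input channels plus `num_layers * growth_rate`, which is why dense networks can stay "thin" per layer while still accumulating rich features.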
“…The drawback is that it is computationally complex, i.e., it requires a vast number of parameters (187M) and memory resources, which unsurprisingly leads to low speed. To reduce the complexity, we adopt DenseNet-41 [61,62] as the feature extractor to improve feature extraction. DenseNet-41 comprises four densely connected units with 41 layers and contains fewer parameters, which gives it a computational advantage over Hourglass104.…”
Section: Feature Extraction Using Customized Backbone Networkmentioning
confidence: 99%
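The parameter saving claimed above comes from the thin layers: a dense layer emits only growth-rate channels, so its convolution holds far fewer weights than a same-resolution wide convolution. A back-of-envelope sketch (the 256-channel and growth-rate-32 figures are illustrative assumptions, not the paper's exact configuration):

```python
def conv_params(k, c_in, c_out):
    """Number of weights in a k x k convolution (bias ignored)."""
    return k * k * c_in * c_out

# A standard 3x3 conv mapping 256 -> 256 channels:
wide = conv_params(3, 256, 256)   # 589,824 weights
# A dense layer on the same 256-channel input emits only growth_rate = 32 channels:
thin = conv_params(3, 256, 32)    # 73,728 weights
print(wide, thin, wide // thin)   # the thin layer uses 8x fewer weights
```

Summed over many layers, this per-layer reduction is what lets a dense backbone undercut a wide one such as Hourglass104 in parameter count.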
“…[Table row: Transition layer 2, output 16 × 16, 150 channels.] The extensive dense links increase the number of feature maps (FMs); therefore, a transition layer is introduced to reduce the key-point dimensions from the previous DB, as discussed in [32,33]. The features computed by DenseNet-100 are down-sampled with stride rate R = 4 and then passed to compute three types of heads, described in the following sections.…”
Section: Feature Extraction Using Densenet-100mentioning
confidence: 99%
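A DenseNet transition layer typically compresses channels with a 1×1 convolution and halves the spatial size with 2×2 average pooling, which is how it "reduces the key-point dimension" between dense blocks. The sketch below assumes a compression factor of 0.5, which is the common DenseNet choice; the citing paper does not state its exact value here, and the 1×1 weights are stand-ins.

```python
import numpy as np

def transition(x, theta=0.5):
    """Transition-layer sketch: 1x1 conv compression + 2x2 average pooling."""
    h, w, c = x.shape
    c_out = int(c * theta)
    w1 = np.full((c, c_out), 1.0 / c)   # stand-in 1x1 conv weights
    x = x @ w1                          # channel compression (c -> c * theta)
    # 2x2 average pooling: halve both spatial dimensions.
    x = x.reshape(h // 2, 2, w // 2, 2, c_out).mean(axis=(1, 3))
    return x

x = np.ones((32, 32, 150))
y = transition(x)
print(y.shape)   # (16, 16, 75): half the spatial size, half the channels
```

Two such halvings after the input stem give the overall stride of R = 4 mentioned above, i.e., the heads operate on feature maps at 1/4 of the input resolution.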