2018
DOI: 10.1007/s10588-018-9265-9
Meta features-based scale invariant OCR decision making using LSTM-RNN

Cited by 26 publications (22 citation statements)
References 24 publications

“…CNN has been widely used for solving different problems in different areas [17, 18] but, for the processing of images for health applications, its performance is remarkable. A lot of research exists in which CAD-based diagnosis of diseases is proposed.…”
Section: Related Work
Mentioning (confidence: 99%)

“…To avoid vanishing gradient difficulty, a Rectified Linear Unit (ReLU) layer is also added after each convolution layer, as an element-wise activation function. Some other CNN layers are the input layer, the dropout layer, the output layer, and the network in network layer [17, 18].…”
Section: Introduction
Mentioning (confidence: 99%)

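The statement above lists the usual CNN building blocks: an input layer, convolution layers each followed by an element-wise ReLU to ease the vanishing-gradient problem, dropout, and an output layer. A minimal sketch of such a stack, assuming PyTorch and illustrative layer sizes (not values from the cited work):

    import torch
    import torch.nn as nn

    # Each convolution is followed by an element-wise ReLU activation,
    # which mitigates vanishing gradients; dropout precedes the output layer.
    model = nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1),   # input: 1-channel images
        nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Conv2d(16, 32, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Dropout(p=0.5),                            # dropout layer
        nn.Linear(32 * 8 * 8, 10),                    # output layer, 10 classes (illustrative)
    )

    x = torch.randn(4, 1, 32, 32)   # dummy batch of 32x32 grayscale images
    logits = model(x)               # shape: (4, 10)
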
“…RNNs are suitable for temporal information classification as CNNs; however, owing to their feedback property, they can handle sequences like video frames in a better way comparatively. RNNs are computationally efficient; however, generally harder to train compared with CNN, thus requiring pre-extracted features or a CNN stage before the classification stage [20].…”
Section: Radial Bias Network 1988
Mentioning (confidence: 99%)

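This statement describes the common arrangement in which a CNN stage extracts per-frame features and a recurrent network classifies the temporal sequence (e.g. video frames). A minimal sketch of that pipeline, assuming PyTorch and illustrative dimensions; a plain nn.RNN is used here, with an LSTM or GRU being the usual substitute when training proves difficult:

    import torch
    import torch.nn as nn

    class CNNThenRNN(nn.Module):
        # CNN stage extracts features from each frame; the RNN models the sequence.
        def __init__(self, num_classes=5, feat_dim=64, hidden_dim=128):
            super().__init__()
            self.cnn = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
                nn.Flatten(), nn.Linear(32, feat_dim),
            )
            self.rnn = nn.RNN(feat_dim, hidden_dim, batch_first=True)
            self.head = nn.Linear(hidden_dim, num_classes)

        def forward(self, frames):                  # frames: (batch, time, 3, H, W)
            b, t = frames.shape[:2]
            feats = self.cnn(frames.flatten(0, 1))  # per-frame features: (batch*time, feat_dim)
            _, h = self.rnn(feats.view(b, t, -1))   # final hidden state: (1, batch, hidden_dim)
            return self.head(h[-1])                 # class logits: (batch, num_classes)

    # Example: a batch of 2 clips, 8 RGB frames each, 32x32 pixels.
    logits = CNNThenRNN()(torch.randn(2, 8, 3, 32, 32))
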
“…LSTM is another deep network that shows considerably better results due to its memory cells, equation updating mechanism, and back-propagation refinements [16,17]. The architecture of LSTM makes it one of the most promising deep networks.…”
Section: Introduction
Mentioning (confidence: 99%)

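The memory cells and update equations referred to above are, in the standard LSTM formulation (the cited work may refine or vary these):

    \begin{aligned}
    f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{(forget gate)}\\
    i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{(input gate)}\\
    o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{(output gate)}\\
    \tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{(candidate cell state)}\\
    c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{(memory cell update)}\\
    h_t &= o_t \odot \tanh(c_t) && \text{(hidden state)}
    \end{aligned}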