2014 14th International Conference on Frontiers in Handwriting Recognition
DOI: 10.1109/icfhr.2014.56
Handwritten Character Recognition by Alternately Trained Relaxation Convolutional Neural Network

Abstract: Deep learning methods have recently achieved impressive performance in visual recognition and speech recognition. In this paper, we propose a handwriting recognition method based on a relaxation convolutional neural network (R-CNN) and an alternately trained relaxation convolutional neural network (ATR-CNN). Previous methods regularize CNNs at the fully-connected layer or the spatial-pooling layer; in contrast, we focus on the convolutional layer. The relaxation convolution layer adopted in our R-CNN, unlike traditional …
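The relaxation convolution layer described in the abstract is, in essence, a convolution whose weight sharing across spatial positions is dropped, so each output position gets its own filter. Below is a minimal NumPy sketch, an illustrative assumption rather than the paper's implementation, contrasting a standard shared-weight convolution with such a position-wise relaxed variant; the function names and tensor shapes are hypothetical.

import numpy as np

def conv2d_shared(x, w):
    # x: (H, W) input map, w: (k, k) single filter shared over all positions.
    # Valid cross-correlation (convolution without kernel flip).
    H, W = x.shape
    k = w.shape[0]
    out = np.empty((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + k, j:j + k] * w)
    return out

def conv2d_relaxed(x, w_per_pos):
    # w_per_pos: (H-k+1, W-k+1, k, k) -- a separate filter for every output
    # position, i.e. the weight-sharing restriction is relaxed.
    Ho, Wo, k, _ = w_per_pos.shape
    out = np.empty((Ho, Wo))
    for i in range(Ho):
        for j in range(Wo):
            out[i, j] = np.sum(x[i:i + k, j:j + k] * w_per_pos[i, j])
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal((8, 8))
    w = rng.standard_normal((3, 3))
    w_pos = rng.standard_normal((6, 6, 3, 3))
    print(conv2d_shared(x, w).shape)       # (6, 6), 9 weights in total
    print(conv2d_relaxed(x, w_pos).shape)  # (6, 6), 6*6*9 = 324 weights

With 3x3 receptive fields on an 8x8 input, the shared version learns 9 weights while the relaxed version learns 324; this is the increase in parameter count that the citation statements below refer to.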

Cited by 105 publications (46 citation statements). References 23 publications.
“…We can see that the best accuracy of DLQDF is 94.44%, which is comparable to the deep CNN's 94.47% announced in [7], but still inferior to the relaxation CNN's 95.04% [8]. In [8], the weight-sharing restriction of the convolutional layers is relaxed to enhance the expressive capability, and at the same time the number of parameters is largely increased. Unlike these deep CNNs, no extra synthesized samples are used for training in this experiment.…”
Section: Combining Statistical Correlation Expansion and Spatial Correlation (mentioning)
confidence: 45%
“…Comparing Table 18 with Table 4, by combining DQFL and training set expansion, the test accuracies of NPC, MQDF and DLQDF are improved by 5.94%, 2.53% and 2.62%, respectively; meanwhile, the accuracy gap between NPC and DLQDF is reduced from 4.08% to 0.76%, which demonstrates that quadratic features are more beneficial for linear classifiers than for quadratic classifiers. As shown in Table 19, on the ICDAR-2013 test set, the best accuracy of DLQDF reaches 94.92%, which is close to the relaxation CNN's 95.04% [8], but significantly inferior to the CNN committee's 96.06% [8]. When considering test speed, our system is much faster than the CNN, as analyzed in the next subsection.…”
Section: Effect of Training Set Expansion (mentioning)
confidence: 53%