2020
DOI: 10.1109/access.2019.2960161
A New Hybrid Convolutional Neural Network and eXtreme Gradient Boosting Classifier for Recognizing Handwritten Ethiopian Characters

Abstract: Handwritten character recognition has been studied extensively for many years in the field of pattern recognition. Owing to its vast practical applications and financial implications, handwritten character recognition remains an important research area. In this research, a Handwritten Ethiopian Character Recognition (HECR) dataset is prepared to train a model. Images in the HECR dataset were collected with more than one pen color, stored in the RGB color space, and size-normalized to 28 × 28 pixels. The dataset is a co…
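The title describes a hybrid model in which a convolutional network learns features from the 28 × 28 character images and an XGBoost classifier makes the final prediction. The Python sketch below illustrates that pipeline under stated assumptions: the layer sizes, training settings, and number of output classes are placeholders, not the configuration reported in the paper.

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
import xgboost as xgb

NUM_CLASSES = 265  # placeholder; not the class count used in the paper

def build_feature_extractor():
    # Small CNN; the last dense layer acts as the feature vector fed to XGBoost.
    return models.Sequential([
        layers.Input(shape=(28, 28, 3)),          # RGB images normalized to 28 x 28
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
    ])

def train_hybrid(x_train, y_train):
    # x_train: (n, 28, 28, 3) float images; y_train: integer labels in [0, NUM_CLASSES)
    cnn = build_feature_extractor()
    # Train the CNN end to end with a temporary softmax head.
    head = models.Sequential([cnn, layers.Dense(NUM_CLASSES, activation="softmax")])
    head.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    head.fit(x_train, y_train, epochs=5, batch_size=128, verbose=0)
    # Use the trained CNN as a fixed feature extractor and fit XGBoost on its outputs.
    features = cnn.predict(x_train, verbose=0)
    booster = xgb.XGBClassifier(n_estimators=200, max_depth=6,
                                learning_rate=0.1, objective="multi:softprob")
    booster.fit(features, y_train)
    return cnn, booster

At inference time the same CNN would produce features for new images and booster.predict would return the character class; this is one common way to pair a learned feature extractor with a tree ensemble, not necessarily the paper's exact training procedure.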

Cited by 41 publications (20 citation statements)
References 23 publications
“…The Extreme Gradient Boosting (XGBoost) is a decision-tree-based ensemble machine learning algorithm that uses a gradient boosting framework. As we said before, an ensemble method is a machine learning technique that combines several base models to produce one optimal predictive model (Weldegebriel et al. 2020). An algorithm is called boosting if it works by adding models on top of each other iteratively; the errors of the previous model are corrected by the next predictor until the training data is accurately predicted or reproduced by the model.…”
Section: Extreme Gradient Boosting Classifier
Mentioning confidence: 99%
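The iterative error-correcting behaviour described in this citation statement can be shown with a toy gradient boosting loop in which each new tree is fitted to the residuals of the current ensemble. This is a bare-bones sketch of the boosting idea, not XGBoost's regularized, second-order implementation; the data and hyperparameters are arbitrary.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 6, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

learning_rate = 0.1
prediction = np.full_like(y, y.mean())   # start from a constant model
trees = []
for _ in range(100):
    residual = y - prediction                      # errors of the current ensemble
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    prediction += learning_rate * tree.predict(X)  # the next predictor corrects them
    trees.append(tree)

print("training MSE after boosting:", np.mean((y - prediction) ** 2))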
“…Among the learning-based methods, the convolutional neural network is a typical deep learning model. Due to its shared structure of local receptive fields and weights, the model has strong learning ability and does not require artificial prior knowledge [25–27]. Figure 1 shows the hierarchical structure of the convolutional neural network.…”
Section: Construction of a Classification Model of Porcelain Fragment Texture Images Based on Convolutional Neural Network
Mentioning confidence: 99%
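The shared-weight, local-receptive-field property mentioned in this statement is easy to verify: a convolutional layer reuses one small kernel at every spatial position, so its parameter count does not depend on the image size. The shapes below are illustrative.

import tensorflow as tf
from tensorflow.keras import layers

conv = layers.Conv2D(filters=8, kernel_size=3)   # eight 3x3 kernels
x = tf.random.normal((1, 28, 28, 1))             # one 28 x 28 grayscale image
_ = conv(x)                                      # build the layer's weights

kernel, bias = conv.weights
print(kernel.shape)  # (3, 3, 1, 8): a 3x3 local receptive field, shared everywhere
print(bias.shape)    # (8,)
# Total trainable parameters: 3*3*1*8 + 8 = 80, independent of the input resolution.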
“…[1] where the authors use shallow NNs as base learners in the GBM. The same combination of the NN and XGBoost was proposed by Weldegebriel et al. [30]. Ideas of the NN and GBM combination have also been studied by other authors, for example, [2,20].…”
Section: Introduction
Mentioning confidence: 90%