2001
DOI: 10.1016/s0031-3203(00)00043-1
Bootstrapping for efficient handwritten digit recognition

Cited by 17 publications (5 citation statements)
References 7 publications
“…If we look for the k nearest patterns of a level only among some classes, the computational cost is reduced considerably. In [14] a bootstrapping algorithm to improve classification accuracy and several algorithms to reduce the training set size are applied. The results are compared with Hart's condensing algorithm, and we can observe that when the training set is reduced below the size produced by Hart's condensing, the hit rate drops sharply.…”
Section: (mentioning)
confidence: 99%
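The statement above compares training-set reduction against Hart's condensing algorithm, which keeps only a consistent subset of the training set: a subset on which a 1-NN classifier still labels every original training point correctly. A minimal sketch, assuming Euclidean distance (the function name and loop bound are my own, not from the paper):

```python
import numpy as np

def hart_condense(X, y, max_passes=10):
    """Hart's condensing (CNN): grow a subset `keep` until 1-NN on
    X[keep] classifies every training point correctly."""
    keep = [0]  # seed with an arbitrary sample
    for _ in range(max_passes):
        changed = False
        for i in range(len(X)):
            if i in keep:
                continue
            # 1-NN prediction using the current condensed set
            d = np.linalg.norm(X[keep] - X[i], axis=1)
            if y[keep][np.argmin(d)] != y[i]:
                keep.append(i)  # misclassified -> must be kept
                changed = True
        if not changed:  # consistent subset reached
            break
    return np.array(keep)
```

The condensed set is typically much smaller than the original, which is why reducing *below* its size starts discarding samples that the 1-NN decision boundary actually needs.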
“…Ordinary bootstrapping is a method of resampling the given data and has been successful for error estimation [15][16][17][18]. The bootstrapping method that creates (rather than selects) new training samples was proposed by Hamamoto et al. [1]; it acts as a smoother of the distribution of the training samples and was successfully applied to the design of the 1-NN classifier, particularly in high-dimensional spaces.…”
Section: Bootstrapping (mentioning)
confidence: 99%
“…The bootstrapping method that creates (rather than selects) new training samples was proposed by Hamamoto et al. [1]; it acts as a smoother of the distribution of the training samples and was successfully applied to the design of the 1-NN classifier, particularly in high-dimensional spaces. Further, Hamamoto et al. [1] generated bootstrap samples by combining the training data locally and showed that the NNC (nearest-neighbour classifier) based on bootstrap patterns performed better than the k-NNC (k-nearest-neighbour classifier) based on the original data [18].…”
Section: Bootstrapping (mentioning)
confidence: 99%
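The local combination described above can be sketched as follows: each bootstrap sample is a random convex combination of a training point and its nearest neighbours from the same class, which smooths the empirical distribution rather than merely resampling it. This is a sketch of the idea only, with assumed details (uniform random weights, Euclidean distance, neighbourhood size `r`); it is not the exact procedure of Hamamoto et al.:

```python
import numpy as np

def local_bootstrap(X, y, r=3, n_samples=None, rng=None):
    """Generate bootstrap training samples by locally combining each
    picked sample with its r nearest same-class neighbours (includes
    the sample itself). Weights are random and sum to 1, so each new
    point is a convex combination of nearby same-class points."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(X) if n_samples is None else n_samples
    Xb, yb = [], []
    for _ in range(n):
        i = rng.integers(len(X))
        same = np.where(y == y[i])[0]           # same-class indices
        d = np.linalg.norm(X[same] - X[i], axis=1)
        nbrs = same[np.argsort(d)[:r]]          # r nearest, incl. i
        w = rng.random(len(nbrs))
        w /= w.sum()                            # convex weights
        Xb.append(w @ X[nbrs])                  # locally smoothed point
        yb.append(y[i])
    return np.array(Xb), np.array(yb)
```

A 1-NN classifier trained on `Xb, yb` sees a smoothed version of each class region, which is the effect the quoted passage attributes to the method, especially in high-dimensional spaces.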
“…Handwritten digit recognition is an important research area. Several techniques have been used to improve recognition performance: contours [1], bootstrapping [2], neural networks [3], Singular Value Decomposition (SVD) [4], k-Nearest Neighbor (k-NN), Learning Vector Quantization (LVQ), Support Vector Classifiers (SVC) with Radial Basis Function (RBF) kernel [5], Multi-Layer Perceptron (MLP) and LeNet [6], and holographic associative memories [7], among others.…”
Section: Introduction (mentioning)
confidence: 99%