Proceedings of the 2008 ACM Symposium on Applied Computing
DOI: 10.1145/1363686.1364103

Using the RRT algorithm to optimize classification systems for handwritten digits and letters

Abstract: Multi-objective genetic algorithms have often been used to optimize classification systems, but little is discussed about their computational cost for solving such problems. This paper optimizes a classification system with an annealing-based approach, the Record-to-Record Travel algorithm. The results are compared to those obtained with a multi-objective genetic algorithm using the same approach. Experiments are performed with isolated handwritten digits and uppercase letters, demonstrating both the effectiveness …
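For context, the sketch below illustrates the generic Record-to-Record Travel acceptance rule that the paper's approach is built on: a neighbouring solution is accepted whenever its score is not worse than the best score found so far (the record) minus a fixed allowed deviation. This is a minimal, hypothetical sketch; the objective function, neighbour operator, and deviation value are placeholders and not the authors' actual classification-system encoding or parameters.

```python
# Minimal sketch of the generic Record-to-Record Travel (RRT) metaheuristic.
# The objective, neighbour operator, and deviation value below are illustrative
# placeholders, not the encoding or settings used in the paper.
import random


def rrt_maximize(initial, neighbour, objective, deviation, max_iters=10_000):
    """Maximize `objective` with Record-to-Record Travel.

    A candidate neighbour is accepted whenever its objective value is not
    worse than the best value found so far (the record) minus `deviation`.
    """
    current = initial
    record_solution = initial
    record_value = objective(initial)

    for _ in range(max_iters):
        candidate = neighbour(current)
        value = objective(candidate)
        # RRT acceptance rule: allow limited worsening relative to the record.
        if value > record_value - deviation:
            current = candidate
            if value > record_value:
                record_value = value
                record_solution = candidate
    return record_solution, record_value


# Toy usage: flip bits of a feature-selection mask to maximize a dummy score
# that stands in for a classifier's validation accuracy.
if __name__ == "__main__":
    random.seed(0)
    n_features = 32
    weights = [random.uniform(-1, 1) for _ in range(n_features)]

    def objective(mask):
        return sum(w for w, bit in zip(weights, mask) if bit)

    def neighbour(mask):
        flipped = list(mask)
        i = random.randrange(n_features)
        flipped[i] = 1 - flipped[i]
        return flipped

    start = [random.randint(0, 1) for _ in range(n_features)]
    best_mask, best_score = rrt_maximize(start, neighbour, objective,
                                         deviation=0.1, max_iters=5000)
    print(f"best score: {best_score:.3f}")
```

Unlike simulated annealing, RRT uses a single, deterministic control parameter (the allowed deviation from the record), which keeps its tuning cost low when each objective evaluation, here a classifier's recognition rate, is expensive.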

Cited by 8 publications (5 citation statements)
References 14 publications
“…In other works, letters from NIST SD 19 are also taken into account. For example, Radtke et al. [89] optimized a classifier using an annealing-based approach called record-to-record travel, reporting an accuracy of 93.78% for letters and 96.53% for digits. Koerich and Kalva [90] tested a multi-layer perceptron, attaining an accuracy of 87.79% on the letters dataset.…”
Section: State of the Art (mentioning)
confidence: 99%
“…The NIST dataset has been used occasionally in neural network systems. Many classifiers make use of only the digit classes [13], [14], whilst others tackle the letter classes as well [15]–[18]. Each paper tackles the task of formulating the classification tasks in a slightly different manner, varying such fundamental aspects as the number of classes to include, the training and testing splits, and the preprocessing of the images.…”
Section: Introduction (mentioning)
confidence: 99%
“…Other authors have also used letters from NIST Special Database 19. For example, Radtke et al. [54] used record-to-record travel (accuracy of 96.53% for digits and 93.78% for letters), Koerich and Kalva [55] tested a multi-layer perceptron only on the letters dataset (accuracy of 87.79%), and Cavalin et al. [56] used hidden Markov models (accuracy of 98% for digits and up to 90% for letters, though this result uses only uppercase letters, and the accuracy decreases to 87% when lowercase letters are also considered).…”
Section: The EMNIST (Extended MNIST) Database (mentioning)
confidence: 99%