2009
DOI: 10.1016/j.neucom.2008.11.011
Modular neural networks with Hebbian learning rule

Cited by 16 publications (7 citation statements)
References 17 publications
“…[10, 20-23, 31, 32, 48, 51]). These models often use supervised neural network classifiers, such as [31-34, 37, 46], and may use associative (assembly) neural networks [17, 18, 20-23].…”
confidence: 99%
“…The compared classifiers are support vector classifiers with radial basis functions (SVC-rbf) [1], support vector machines (SVM) [2, 21], neural networks [7, 9, 33] and a neuro-fuzzy classifier [30]. The results of this comparison are shown in Table 5 for the MNIST dataset, wherein the classifiers were compared according to the number of features (NF) and the percentage recognition rates (RR).…”
Section: Experimental Studies
confidence: 99%
“…Among popular handwritten digit databases, the MNIST database has been widely used in recent years as a benchmark for testing new feature extraction and selection methods or for evaluating new classifiers [1-10]. In the literature, the best accuracies have been achieved for the MNIST database, such as 99.41 % for the robust vision-based features and classification schemes [3], 99.58 % for SVM with gradient features [1], and 99.81 % for convolutional neural networks with elastic distortion [4].…”
Section: Introduction
confidence: 99%