Published: 2000
DOI: 10.1023/a:1008315023738
Untitled

Cited by 8 publications (1 citation statement)
References 6 publications
“…Therefore, the first step in research like this is to locate the best ANN strategy for each problem. The more widespread machine-learning approaches included in this work are: MultiLayer Perceptron (MLP) [13,35], Radial Basis Function (RBF) [43,55], Support Vector Machines (SVM) [18,68], Recurrent Neural Networks (RNN) [33,50], Jordan-Elman Networks (ELN) [38,39] and Self Organizing Feature Maps (SOFM) [15,41]. Moreover, one interesting way to increase the performance of an ANN is to minimize the cost function during the training phase.…”
Section: Introduction
confidence: 99%
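
The quoted statement closes by pointing to cost-function minimization during training as a way to improve an ANN. The sketch below is a minimal illustration of that idea, not code from the cited or citing papers; the network size, the toy XOR data, the learning rate, and the epoch count are all assumptions chosen only to show plain gradient descent driving a mean-squared-error cost downward for a tiny MLP.

```python
# Minimal sketch (illustrative only): one hidden layer, mean-squared-error
# cost, plain gradient descent on the XOR task.
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR inputs and targets (assumed data, purely for illustration).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Random initial weights for a 2-4-1 network.
W1 = rng.normal(scale=0.5, size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros(1)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Cost function: mean squared error over the batch.
    cost = np.mean((out - y) ** 2)

    # Backward pass: gradients of the cost w.r.t. each parameter.
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_h = d_out @ W2.T * h * (1 - h)
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)

    # Gradient-descent step: move each parameter downhill on the cost surface.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final cost:", cost)
print("predictions:", out.ravel().round(2))
```

Each epoch computes the cost on the full batch, backpropagates its gradients, and takes one gradient-descent step; after enough epochs the mean squared error approaches zero and the outputs approximate the XOR targets.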