2014
DOI: 10.3906/elk-1202-89

Impact of small-world topology on the performance of a feed-forward artificial neural network based on 2 different real-life problems

Abstract: Since feed-forward artificial neural networks (FFANNs) are the most widely used models to solve real-life problems, many studies have focused on improving their learning performance by changing the network architecture and learning algorithms. On the other hand, small-world network topology has recently been shown to match the characteristics of real-life problems. Therefore, in this study, instead of focusing on the performance of conventional FFANNs, we investigated how real-life problems can b…

Cited by 21 publications (14 citation statements)
References 24 publications
“…These findings seem to put into perspective previous results on the advantages of a network with small-world architecture [32, 33]. However, it should be kept in mind that we have only considered recurrent networks, not the modified version of a feed-forward network as it was used in, e.g., [34]. The size of the network must also be taken into account here.…”
Section: Results (mentioning)
confidence: 99%
“…Finally, the mathematical model determines the activation level of the neuron using the transfer function, as shown in Figure 2; this is done once the activation level exceeds the threshold value [76,77].…”
Section: Artificial Neural Network Model (mentioning)
confidence: 99%
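
A minimal sketch of the neuron model that statement describes: the weighted input sum is compared against a threshold, and the transfer function is applied once the threshold is exceeded. The sigmoid transfer function and the example weights and threshold are illustrative assumptions, not values taken from the cited paper.

    import numpy as np

    def neuron_output(inputs, weights, threshold):
        # Weighted sum of the inputs (the neuron's net activation).
        net = np.dot(inputs, weights)
        # Below the threshold the neuron does not activate.
        if net <= threshold:
            return 0.0
        # Above the threshold, the transfer function (here: sigmoid) sets the output level.
        return 1.0 / (1.0 + np.exp(-(net - threshold)))

    # Example call with made-up values.
    print(neuron_output(np.array([0.2, 0.7, 0.1]), np.array([0.4, 0.9, -0.3]), 0.5))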
“…Just as biological neuron networks encode data and transfer it from cell to cell, the ANN architecture encodes information by changing the strength of the connections that neurons in different layers form with one another. For this reason, the process of modifying the synaptic strength of these connections is regarded as artificial learning [2][3][4]. In modeling the learning process, particularly for feed-forward ANNs, backpropagation (BP), resilient backpropagation (Rprop) [4], and Levenberg-Marquardt (LM) are the best-known algorithms.…”
Section: Introduction (unclassified)
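
That statement frames learning as the adjustment of synaptic connection strengths, with backpropagation (BP) named among the best-known algorithms for feed-forward ANNs. The sketch below shows a single BP gradient step for a tiny 2-2-1 network; the network size, learning rate, and training pair are made-up illustrative values, and Rprop and Levenberg-Marquardt are not shown.

    import numpy as np

    rng = np.random.default_rng(0)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    x = np.array([0.5, 0.8])      # one training input
    t = np.array([1.0])           # its target output
    W1 = rng.normal(size=(2, 2))  # input -> hidden connection strengths
    W2 = rng.normal(size=(1, 2))  # hidden -> output connection strengths
    eta = 0.1                     # learning rate

    # Forward pass.
    h = sigmoid(W1 @ x)           # hidden activations
    y = sigmoid(W2 @ h)           # network output

    # Backward pass: gradients of the squared error 0.5 * (y - t)**2.
    delta_out = (y - t) * y * (1 - y)
    delta_hid = (W2.T @ delta_out) * h * (1 - h)

    # Learning step: adjust the connection (synaptic) strengths.
    W2 -= eta * np.outer(delta_out, h)
    W1 -= eta * np.outer(delta_hid, x)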
“…the local coefficient (D_Local) and global coefficient (D_Global), proposed by … and extended in scope by Erkaymaz et al. [4,16], were used. Here, D_Local is considered equivalent to 1/C, while D_Global is considered equivalent to the parameter L.…”
(unclassified)
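
That statement defines the two small-world measures in terms of standard graph quantities: D_Local as 1/C (the inverse of the average clustering coefficient) and D_Global as L (the characteristic path length). Below is a short sketch of computing them for an example network; the use of networkx and of a Watts-Strogatz graph with these parameters are assumptions for illustration, not the construction used in the cited papers.

    import networkx as nx

    # Illustrative small-world network (connected Watts-Strogatz graph).
    G = nx.connected_watts_strogatz_graph(n=100, k=4, p=0.1, tries=100)

    C = nx.average_clustering(G)             # average clustering coefficient C
    L = nx.average_shortest_path_length(G)   # characteristic path length L

    # Per the statement: D_Local is equivalent to 1/C, D_Global to L.
    D_local = 1.0 / C
    D_global = L

    print(f"C={C:.3f}  L={L:.3f}  D_local={D_local:.3f}  D_global={D_global:.3f}")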