2003
DOI: 10.1007/s00521-003-0365-0

Simultaneous recurrent neural network trained with non-recurrent backpropagation algorithm for static optimisation

Abstract: This paper explores the feasibility of employing the non-recurrent backpropagation training algorithm for a recurrent neural network, the Simultaneous Recurrent Neural Network (SRN), for static optimisation. A simplifying observation that maps the dynamics of the recurrent network, which is configured to operate in relaxation mode as a static optimizer, to feedforward network dynamics is leveraged to facilitate the application of a non-recurrent training algorithm such as standard backpropagation and its variants. A simulation stu…
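To make the abstract's central observation concrete, the following is a minimal sketch (Python/NumPy, with illustrative names and sizes, not the paper's exact formulation): an SRN in relaxation mode feeds its output back into its input until a fixed point is reached, and the final, converged pass can then be treated as an ordinary feedforward pass to which standard backpropagation applies.

```python
import numpy as np

# Hedged sketch (not the paper's exact formulation): a Simultaneous
# Recurrent Network (SRN) run in relaxation mode iterates its own output
# back to its input until a fixed point is reached. Once relaxed, the
# final pass looks like an ordinary feedforward step, so standard
# (non-recurrent) backpropagation can be applied to that pass alone.

def relax(x_ext, W_in, W_rec, W_out, n_iters=50, tol=1e-6):
    """Iterate the SRN to an (approximate) fixed point for a static input."""
    y = np.zeros(W_out.shape[0])
    for _ in range(n_iters):
        h = np.tanh(W_in @ x_ext + W_rec @ y)  # hidden layer sees input + fed-back output
        y_new = np.tanh(W_out @ h)
        if np.linalg.norm(y_new - y) < tol:    # relaxation reached
            y = y_new
            break
        y = y_new
    return h, y

def backprop_step(x_ext, target, W_in, W_rec, W_out, lr=0.01):
    """Treat the relaxed pass as feedforward and apply standard backprop."""
    h, y = relax(x_ext, W_in, W_rec, W_out)
    err = y - target
    delta_out = err * (1.0 - y**2)                  # tanh derivative at output
    delta_h = (W_out.T @ delta_out) * (1.0 - h**2)  # backprop into hidden layer
    # Gradients of the final (feedforward-like) pass only; the recurrent
    # history is ignored, which is the simplification being exploited.
    W_out -= lr * np.outer(delta_out, h)
    W_in  -= lr * np.outer(delta_h, x_ext)
    W_rec -= lr * np.outer(delta_h, y)
    return err
```

The key simplification is visible in backprop_step: gradients are taken through the last pass only, with the fed-back output treated as a constant input, which is what allows a non-recurrent algorithm to train the recurrent network.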

Citation types: 1 supporting, 4 mentioning, 0 contrasting
Cited by 14 publications (5 citation statements)
References 8 publications
“…13 Therefore, the weight parameters were initialized to the values shown in Table 1 for the SRN trained with the resilient propagation training algorithm as well. More specifically, the constraint weight parameters needed to be increased following a set schedule, which proved critical for the SRN to locate a solution and concurs with the findings of previous studies. 13,15 To encourage the SRN to locate higher-quality solutions, the increment value of the constraint weight parameter for the distance constraint was set as large as possible relative to the values for the remaining parameters, while still ensuring that the network converges to valid solutions.…”
Section: Setup and Initialization for TSP (supporting)
confidence: 73%
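The scheduling procedure described in this statement lends itself to a short illustration. Below is a hedged sketch (all weight names, initial values, and increments are hypothetical; the cited study's Table 1 values are not reproduced): constraint penalty weights grow on a fixed schedule, with the distance-constraint increment chosen largest.

```python
# Hedged sketch of the constraint-weight schedule described above.
# All names and values are illustrative, not the cited study's Table 1.
weights = {"row": 1.0, "column": 1.0, "distance": 1.0}       # initial penalty weights
increments = {"row": 0.05, "column": 0.05, "distance": 0.5}  # distance increment largest

def step_schedule(epoch, every=10):
    """Increase the constraint weights on a fixed schedule (every `every` epochs)."""
    if epoch % every == 0:
        for k in weights:
            weights[k] += increments[k]

def energy(tour_penalties, tour_length):
    """Penalty-weighted objective the SRN relaxes toward (schematic)."""
    return (weights["row"] * tour_penalties["row"]
            + weights["column"] * tour_penalties["column"]
            + weights["distance"] * tour_length)
```

Making the distance increment dominate biases the relaxation toward shorter tours, while the validity constraints are tightened just enough to keep the converged state a legal tour.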
“…It is well known that artificial neural networks (ANNs) can extract usable information from large quantities of noisy, discrete data and can be used to solve highly non-linear and uncertain problems [14][15][16][17][18][19]. In general, two kinds of ANN are used to establish prediction models: the back-propagation artificial neural network (BPANN) and the radial basis function artificial neural network (RBFANN).…”
Section: Artificial Neural Network (mentioning)
confidence: 99%
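For contrast with the BPANN sketched after the next statement, here is a minimal RBFANN forward pass (all sizes, centers, and widths are illustrative assumptions, not taken from the cited works):

```python
import numpy as np

# Hedged sketch of the second kind of ANN named above: an RBFANN replaces
# the BPANN's sigmoidal hidden layer with Gaussian radial basis units.

def rbf_forward(x, centers, widths, w_out):
    """Gaussian RBF hidden layer followed by a linear output layer."""
    phi = np.exp(-np.sum((centers - x) ** 2, axis=1) / (2.0 * widths ** 2))
    return w_out @ phi

rng = np.random.default_rng(0)
centers = rng.uniform(0.0, 1.0, size=(8, 4))  # 8 illustrative centers over 4 inputs
widths = np.full(8, 0.5)                      # shared illustrative width
w_out = rng.normal(size=(1, 8))               # single prediction output
print(rbf_forward(np.array([0.2, 0.5, 0.7, 0.1]), centers, widths, w_out))
```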
“…In addition, fuzzy evaluation can provide a better multi-objective assessment, but it cannot be used to predict the performance of other tests. However, it is well known that artificial neural networks (ANNs) can extract usable information from large quantities of noisy, discrete data and can be used to solve highly non-linear and uncertain problems [20,21]. We therefore established three ANN models with four inputs (phosphoric acid, Al₂O₃, drying temperature and drying time) and either one output (compression strength or tension strength) or two outputs (compression strength and tension strength), as shown in Table 2.…”
Section: Artificial Neural Network (mentioning)
confidence: 99%
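As a rough illustration of the two-output model described in this statement, here is a minimal back-propagation network with four inputs and two outputs (the hidden size, learning rate, and single training sample are fabricated placeholders; the cited study's architecture and data are not reproduced):

```python
import numpy as np

# Hedged sketch of the two-output model described above: a small
# back-propagation ANN with four inputs (phosphoric acid, Al2O3 content,
# drying temperature, drying time) and two outputs (compression and
# tension strength). All numbers here are illustrative placeholders.

rng = np.random.default_rng(1)
W1 = rng.normal(scale=0.5, size=(6, 4))   # 4 inputs -> 6 hidden units
W2 = rng.normal(scale=0.5, size=(2, 6))   # 6 hidden units -> 2 outputs

def train_step(x, t, lr=0.05):
    h = np.tanh(W1 @ x)                    # hidden activations
    y = W2 @ h                             # linear outputs (regression)
    err = y - t
    delta_h = (W2.T @ err) * (1 - h**2)    # backprop through the tanh layer
    W2[...] -= lr * np.outer(err, h)
    W1[...] -= lr * np.outer(delta_h, x)
    return 0.5 * float(err @ err)

# One fabricated, normalized sample: four inputs, then the two strengths.
x = np.array([0.4, 0.3, 0.6, 0.5]); t = np.array([0.8, 0.6])
for epoch in range(200):
    loss = train_step(x, t)
print(f"final loss: {loss:.4f}")
```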