Proceedings of International Conference on Neural Networks (ICNN'96)
DOI: 10.1109/icnn.1996.548946

A noise annealing neural network for global optimization

Abstract: This paper deals with a neural network model for global optimization. The model can solve nonlinear constrained optimization problems with continuous decision variables. By incorporating the noise annealing concept, the model produces a solution that is the global optimum of the original task with probability close to 1. After a brief outline of some existing globally optimizing neural networks, we introduce the stochastic neural model called the noise annealing neural network, which is based o…

Cited by 9 publications (8 citation statements) | References 13 publications

“…Here, we present the results for further refinement of the technique in searching the absolute minimum and comparison of the performances. The stochastic neural network as described by Wong [13] and Biro et al [14], which can be utilized to enhance our global optimization, was obtained by adding Gaussian noise to the input of the Hopfield network. That is…”
Section: Description Of Stv and Results Of Particle Tracking Algorithmsmentioning
confidence: 99%
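
The quotation breaks off at "That is…", so the exact formula is not recoverable here. As a rough, non-authoritative sketch of the mechanism it describes (annealed Gaussian noise injected into the net input of a continuous Hopfield-style network), one might write the update as below; the function name, cooling schedule, and parameter values are illustrative assumptions, not the cited authors' formulation.

import numpy as np

def noise_annealed_hopfield(W, b, steps=2000, sigma0=1.0, decay=0.997, seed=0):
    # Continuous Hopfield-style dynamics with Gaussian noise added to the
    # net input; the noise amplitude sigma is annealed toward zero so the
    # network first explores and then settles. Illustrative sketch only.
    rng = np.random.default_rng(seed)
    n = b.shape[0]
    v = rng.uniform(-0.1, 0.1, size=n)   # initial neuron outputs
    sigma = sigma0                       # initial noise amplitude (assumed)
    for _ in range(steps):
        u = W @ v + b + sigma * rng.standard_normal(n)  # noisy net input
        v = np.tanh(u)                   # smooth activation
        sigma *= decay                   # geometric cooling of the noise
    return v

With sigma held fixed this is an ordinary stochastic Hopfield iteration; the annealing of sigma is what pushes the final state toward a (near-)global minimum of the associated energy.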
“…Numerous techniques based on analogies to statistical physics are designed for avoiding local optima [14]- [16]. In the case of discrete optimization such a network is the mean field neural network.…”
Section: Introduction (mentioning)
confidence: 99%
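
For context on the "mean field neural network" named in this statement: mean-field annealing replaces each stochastic binary unit by its thermal average and gradually lowers a temperature parameter. The sketch below is my assumption of the standard mean-field update, not the cited authors' code; all names and schedule values are illustrative.

import numpy as np

def mean_field_annealing(W, b, T0=10.0, T_min=0.01, cooling=0.95, sweeps=50):
    # Mean-field annealing: each unit's output is its thermal average
    # v_i = tanh((W v + b)_i / T); lowering T hardens the averages
    # toward a binary configuration.
    rng = np.random.default_rng(1)
    v = 0.01 * rng.standard_normal(W.shape[0])  # near-zero start
    T = T0
    while T > T_min:
        for _ in range(sweeps):
            v = np.tanh((W @ v + b) / T)  # fixed-point iteration at temperature T
        T *= cooling                      # cooling schedule
    return v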
“…While some researchers tried to find more appropriate parameters in the energy function (Aiyer et al, 1990; Baba, 1989; Biro et al, 1996; Burke, 1994; Gall & Zissimopoulos, 1999; Gee & Prager, 1995; Hegde et al, 1988; Hopfield & Tank, 1985; Huang, 2005; Sharbaro, 1994; Wilson & Pawley, 1988), others hoped to get better energy functions (Baba, 1989). To date, research work has been extended to every aspect of the Hopfield model (Aarts & Laarhoven, 1985; Abe et al, 1992; Aiyer et al, 1990; Baba, 1989; Biro et al, 1996; Burke, 1994; Hegde et al, 1988; Hopfield & Tank, 1985; Sharbaro, 1994; Wilson & Pawley, 1988), and it is now clear how to correctly map problems onto the network so that invalid solutions never emerge. As for the quality of obtained solutions, while there are indications that the Hopfield model is solely suitable for solving Euclidean TSPs of small size (Wilson & Pawley, 1988), some researchers argue it is unreasonable to take the TSP as the benchmark to measure the optimization ability of the Hopfield model (Sharbaro, 1994).…”
Section: Hopfield Model for Solving TSP Information (mentioning)
confidence: 99%
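
As background for what "parameters in the energy function" refers to in the statement above: in the classic Hopfield–Tank mapping, the TSP energy combines constraint penalties with tour length, in the standard form (not specific to any one cited paper)

E = \frac{A}{2}\sum_{x}\sum_{i}\sum_{j \neq i} v_{xi} v_{xj} + \frac{B}{2}\sum_{i}\sum_{x}\sum_{y \neq x} v_{xi} v_{yi} + \frac{C}{2}\Big(\sum_{x}\sum_{i} v_{xi} - n\Big)^{2} + \frac{D}{2}\sum_{x}\sum_{y \neq x}\sum_{i} d_{xy}\, v_{xi}\,(v_{y,i+1} + v_{y,i-1})

where v_{xi} = 1 means city x occupies tour position i, d_{xy} is the inter-city distance, and A, B, C, D are the penalty weights whose tuning the cited works investigate.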
“…In terms of the geometric structure of the distribution of the cities and the symmetry of distances between a pair of cities, the TSP can be classified into several categories (Aiyer et al, 1990; Baba, 1989; Biro et al, 1996; Burke, 1994; Gee & Prager, 1995; Hegde et al, 1988; Sharbaro, 1994; Wilson & Pawley, 1988). The Hopfield model for the TSP is built of n × n neurons.…”
Section: Hopfield Model to Solve the TSP (mentioning)
confidence: 99%
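
To make the n × n encoding concrete: the standard representation (assumed here; the helper below is illustrative, not from the cited works) is a permutation matrix V with V[x, i] = 1 iff city x is visited at tour position i, on which the tour length can be read off directly.

import numpy as np

def tour_length(V, D):
    # V: n x n binary matrix, V[x, i] = 1 iff city x occupies tour position i
    # (assumed to encode a valid permutation). D: symmetric distance matrix.
    n = V.shape[0]
    order = np.argmax(V, axis=0)  # city assigned to each tour position
    return sum(D[order[i], order[(i + 1) % n]] for i in range(n))

# Usage: a valid 4-city tour encoded as a permutation matrix
D = np.array([[0, 1, 2, 1],
              [1, 0, 1, 2],
              [2, 1, 0, 1],
              [1, 2, 1, 0]], dtype=float)
V = np.eye(4)                     # tour 0 -> 1 -> 2 -> 3 -> 0
print(tour_length(V, D))          # 4.0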