1997
DOI: 10.1109/49.552065

Improved neural heuristics for multicast routing

Cited by 82 publications
(30 citation statements)
References 25 publications
“…Some of its other applications can be found in [1,3,4,9,12,25,26,89,101-103,157,174] and several papers reviewing this subject can be found in the papers of the special issue in [49].…”
Section: The Random Neural Network
confidence: 99%
“…Other applications of the random neural network that do not require learning include function optimization (Gelenbe, Koubi, and Pekergin [99]) and texture generation (Atalay and Gelenbe [9], Atalay, Gelenbe, and Yalabik [10]). Applications of the RNN were published for video compression (Cramer, Gelenbe, and Bakircioglu [20,21]), complex recognition tasks (Abdelbaki, Gelenbe, and El-Khamy [1], Abdelbaki, Gelenbe, and Kocak [2], Abdelbaki et al [3], Aguilar and Gelenbe [8], Gelenbe, Ghanwani, and Srinivasan [85], Hocaoglu et al [155]), and to the sensory search of patterns and objects (Gelenbe and Cao [74], Gelenbe and Koçak [97], Gelenbe, Koçak, and Wang [98]). A polynomial time-complexity learning algorithm for RNNs having soma-to-soma interactions was first presented in (Gelenbe and Timotheou [142]) and is further developed in (Wang and Gelenbe [184]).…”
Section: Extensions and Applications of the Random Neural Network (RNN)
confidence: 99%
“…In this way, an appropriate region for the optimal step size is identified; then a divide-and-conquer procedure is followed to find the largest value β_m satisfying Eq. (24).…”
Section: Projected Gradient Non-Negative Least-Squares Algorithm
confidence: 99%
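The step-size search quoted above (bracket a feasible region, then divide and conquer to find the largest β_m that still satisfies the acceptance condition) can be sketched with a bisection routine. This is an illustrative sketch, not the cited paper's exact procedure: the function names are mine, and a simple objective-decrease test on a projected-gradient step for non-negative least squares stands in for the paper's Eq. (24).

```python
import numpy as np

def largest_feasible_step(beta_lo, beta_hi, satisfies, tol=1e-6):
    # Divide-and-conquer (bisection) search for the largest step size
    # beta in [beta_lo, beta_hi] for which satisfies(beta) holds,
    # assuming the feasible step sizes form an interval starting at beta_lo.
    while beta_hi - beta_lo > tol:
        mid = 0.5 * (beta_lo + beta_hi)
        if satisfies(mid):
            beta_lo = mid  # mid is feasible: the largest value lies above it
        else:
            beta_hi = mid  # mid is infeasible: the largest value lies below it
    return beta_lo

def projected_gradient_nnls_step(A, b, x, beta_max=1.0):
    # One projected-gradient step for min ||Ax - b||^2 subject to x >= 0,
    # using the largest step size that still decreases the objective
    # (a stand-in for the acceptance condition of Eq. (24)).
    grad = A.T @ (A @ x - b)
    obj = lambda y: np.sum((A @ y - b) ** 2)
    step = lambda beta: np.maximum(x - beta * grad, 0.0)  # project onto x >= 0
    decreases = lambda beta: obj(step(beta)) < obj(x)
    beta = largest_feasible_step(0.0, beta_max, decreases)
    return step(beta)
```

The bisection halves the bracket each iteration, so the cost of locating β_m to tolerance `tol` is logarithmic in the bracket width, at one condition evaluation per halving.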
“…RNN has attracted a lot of attention in the scientific community due to its analytical solvability, excellent learning capacity, implementation ease, as well as its representational, modeling and universal approximation capabilities [19]. RNN has also been applied for the solution for different types of problems including optimization (e.g., minimum Steiner tree [24], assignment of assets to tasks under uncertainty [31], task assignment in distributed systems [2], rescuer assignment of emergency evacuation [25], cognitive packet networks [28]) and modeling (e.g., G-networks [3,18,20,22], gene regulatory networks [23], and protein interaction networks [42]) problems. Nonetheless, the most important application area of RNN regards the solution of supervised learning problems such as laser intensity vehicle classification [35], wafer surface reconstruction [27], mine detection [1] and denial-of-service attack detection [41].…”
Section: Introduction
confidence: 99%
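The "analytical solvability" these citation statements attribute to the RNN refers to its product-form stationary solution: each neuron i has an excitation probability q_i = λ_i^+ / (r_i + λ_i^-), where the excitatory and inhibitory arrival rates λ_i^± combine external rates with the weighted activity of the other neurons. A minimal fixed-point sketch (the function name and the clipping safeguard are my assumptions):

```python
import numpy as np

def rnn_steady_state(Lambda, lam, W_plus, W_minus, r, iters=200):
    # Fixed-point iteration for the random neural network's stationary
    # excitation probabilities q_i:
    #   q_i = lambda_plus_i / (r_i + lambda_minus_i)
    # with lambda_plus_i  = Lambda_i + sum_j q_j * W_plus[j, i]
    # and  lambda_minus_i = lam_i    + sum_j q_j * W_minus[j, i],
    # where Lambda/lam are external excitatory/inhibitory arrival rates,
    # W_plus/W_minus are non-negative weight matrices, and r are firing rates.
    n = len(r)
    q = np.zeros(n)
    for _ in range(iters):
        lam_plus = Lambda + q @ W_plus
        lam_minus = lam + q @ W_minus
        q = np.clip(lam_plus / (r + lam_minus), 0.0, 1.0)  # keep q_i in [0, 1]
    return q
```

With no recurrent weights the iteration reduces immediately to q_i = Λ_i / (r_i + λ_i), which makes the closed-form character of the model easy to check.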